Nov 24 01:08:55 localhost kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 24 01:08:55 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 24 01:08:55 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 01:08:55 localhost kernel: BIOS-provided physical RAM map:
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 24 01:08:55 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 24 01:08:55 localhost kernel: NX (Execute Disable) protection: active
Nov 24 01:08:55 localhost kernel: APIC: Static calls initialized
Nov 24 01:08:55 localhost kernel: SMBIOS 2.8 present.
Nov 24 01:08:55 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 24 01:08:55 localhost kernel: Hypervisor detected: KVM
Nov 24 01:08:55 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 24 01:08:55 localhost kernel: kvm-clock: using sched offset of 4373707850 cycles
Nov 24 01:08:55 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 24 01:08:55 localhost kernel: tsc: Detected 2800.000 MHz processor
Nov 24 01:08:55 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 24 01:08:55 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 24 01:08:55 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 24 01:08:55 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 24 01:08:55 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 24 01:08:55 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 24 01:08:55 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 24 01:08:55 localhost kernel: Using GB pages for direct mapping
Nov 24 01:08:55 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 24 01:08:55 localhost kernel: ACPI: Early table checksum verification disabled
Nov 24 01:08:55 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 24 01:08:55 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 01:08:55 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 01:08:55 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 01:08:55 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 24 01:08:55 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 01:08:55 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 01:08:55 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 24 01:08:55 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 24 01:08:55 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 24 01:08:55 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 24 01:08:55 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 24 01:08:55 localhost kernel: No NUMA configuration found
Nov 24 01:08:55 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 24 01:08:55 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 24 01:08:55 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 24 01:08:55 localhost kernel: Zone ranges:
Nov 24 01:08:55 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 24 01:08:55 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 24 01:08:55 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 01:08:55 localhost kernel:   Device   empty
Nov 24 01:08:55 localhost kernel: Movable zone start for each node
Nov 24 01:08:55 localhost kernel: Early memory node ranges
Nov 24 01:08:55 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 24 01:08:55 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 24 01:08:55 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 01:08:55 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 24 01:08:55 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 24 01:08:55 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 24 01:08:55 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 24 01:08:55 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 24 01:08:55 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 24 01:08:55 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 24 01:08:55 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 24 01:08:55 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 24 01:08:55 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 24 01:08:55 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 24 01:08:55 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 24 01:08:55 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 24 01:08:55 localhost kernel: TSC deadline timer available
Nov 24 01:08:55 localhost kernel: CPU topo: Max. logical packages:   8
Nov 24 01:08:55 localhost kernel: CPU topo: Max. logical dies:       8
Nov 24 01:08:55 localhost kernel: CPU topo: Max. dies per package:   1
Nov 24 01:08:55 localhost kernel: CPU topo: Max. threads per core:   1
Nov 24 01:08:55 localhost kernel: CPU topo: Num. cores per package:     1
Nov 24 01:08:55 localhost kernel: CPU topo: Num. threads per package:   1
Nov 24 01:08:55 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 24 01:08:55 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 24 01:08:55 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 24 01:08:55 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 24 01:08:55 localhost kernel: Booting paravirtualized kernel on KVM
Nov 24 01:08:55 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 24 01:08:55 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 24 01:08:55 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 24 01:08:55 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 24 01:08:55 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 24 01:08:55 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 24 01:08:55 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 01:08:55 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 24 01:08:55 localhost kernel: random: crng init done
Nov 24 01:08:55 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 24 01:08:55 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 24 01:08:55 localhost kernel: Fallback order for Node 0: 0 
Nov 24 01:08:55 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 24 01:08:55 localhost kernel: Policy zone: Normal
Nov 24 01:08:55 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 24 01:08:55 localhost kernel: software IO TLB: area num 8.
Nov 24 01:08:55 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 24 01:08:55 localhost kernel: ftrace: allocating 49298 entries in 193 pages
Nov 24 01:08:55 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 24 01:08:55 localhost kernel: Dynamic Preempt: voluntary
Nov 24 01:08:55 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 24 01:08:55 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 24 01:08:55 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 24 01:08:55 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 24 01:08:55 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 24 01:08:55 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 24 01:08:55 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 24 01:08:55 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 24 01:08:55 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 01:08:55 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 01:08:55 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 01:08:55 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 24 01:08:55 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 24 01:08:55 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 24 01:08:55 localhost kernel: Console: colour VGA+ 80x25
Nov 24 01:08:55 localhost kernel: printk: console [ttyS0] enabled
Nov 24 01:08:55 localhost kernel: ACPI: Core revision 20230331
Nov 24 01:08:55 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 24 01:08:55 localhost kernel: x2apic enabled
Nov 24 01:08:55 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 24 01:08:55 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 24 01:08:55 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 24 01:08:55 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 24 01:08:55 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 24 01:08:55 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 24 01:08:55 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 24 01:08:55 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 24 01:08:55 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 24 01:08:55 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 24 01:08:55 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 24 01:08:55 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 24 01:08:55 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 24 01:08:55 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 24 01:08:55 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 24 01:08:55 localhost kernel: x86/bugs: return thunk changed
Nov 24 01:08:55 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 24 01:08:55 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 24 01:08:55 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 24 01:08:55 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 24 01:08:55 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 24 01:08:55 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 24 01:08:55 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 24 01:08:55 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 24 01:08:55 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 24 01:08:55 localhost kernel: landlock: Up and running.
Nov 24 01:08:55 localhost kernel: Yama: becoming mindful.
Nov 24 01:08:55 localhost kernel: SELinux:  Initializing.
Nov 24 01:08:55 localhost kernel: LSM support for eBPF active
Nov 24 01:08:55 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 01:08:55 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 01:08:55 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 24 01:08:55 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 24 01:08:55 localhost kernel: ... version:                0
Nov 24 01:08:55 localhost kernel: ... bit width:              48
Nov 24 01:08:55 localhost kernel: ... generic registers:      6
Nov 24 01:08:55 localhost kernel: ... value mask:             0000ffffffffffff
Nov 24 01:08:55 localhost kernel: ... max period:             00007fffffffffff
Nov 24 01:08:55 localhost kernel: ... fixed-purpose events:   0
Nov 24 01:08:55 localhost kernel: ... event mask:             000000000000003f
Nov 24 01:08:55 localhost kernel: signal: max sigframe size: 1776
Nov 24 01:08:55 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 24 01:08:55 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 24 01:08:55 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 24 01:08:55 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 24 01:08:55 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 24 01:08:55 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 24 01:08:55 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 24 01:08:55 localhost kernel: node 0 deferred pages initialised in 10ms
Nov 24 01:08:55 localhost kernel: Memory: 7765928K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616268K reserved, 0K cma-reserved)
Nov 24 01:08:55 localhost kernel: devtmpfs: initialized
Nov 24 01:08:55 localhost kernel: x86/mm: Memory block size: 128MB
Nov 24 01:08:55 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 24 01:08:55 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 24 01:08:55 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 24 01:08:55 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 24 01:08:55 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 24 01:08:55 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 24 01:08:55 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 24 01:08:55 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 24 01:08:55 localhost kernel: audit: type=2000 audit(1763946533.120:1): state=initialized audit_enabled=0 res=1
Nov 24 01:08:55 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 24 01:08:55 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 24 01:08:55 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 24 01:08:55 localhost kernel: cpuidle: using governor menu
Nov 24 01:08:55 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 24 01:08:55 localhost kernel: PCI: Using configuration type 1 for base access
Nov 24 01:08:55 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 24 01:08:55 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 24 01:08:55 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 24 01:08:55 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 24 01:08:55 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 24 01:08:55 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 24 01:08:55 localhost kernel: Demotion targets for Node 0: null
Nov 24 01:08:55 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 24 01:08:55 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 24 01:08:55 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 24 01:08:55 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 24 01:08:55 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 24 01:08:55 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 24 01:08:55 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 24 01:08:55 localhost kernel: ACPI: Interpreter enabled
Nov 24 01:08:55 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 24 01:08:55 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 24 01:08:55 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 24 01:08:55 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 24 01:08:55 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 24 01:08:55 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 24 01:08:55 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [3] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [4] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [5] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [6] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [7] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [8] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [9] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [10] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [11] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [12] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [13] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [14] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [15] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [16] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [17] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [18] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [19] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [20] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [21] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [22] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [23] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [24] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [25] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [26] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [27] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [28] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [29] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [30] registered
Nov 24 01:08:55 localhost kernel: acpiphp: Slot [31] registered
Nov 24 01:08:55 localhost kernel: PCI host bridge to bus 0000:00
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 24 01:08:55 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 24 01:08:55 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 24 01:08:55 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 01:08:55 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 24 01:08:55 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 24 01:08:55 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 24 01:08:55 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 24 01:08:55 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 24 01:08:55 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 24 01:08:55 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 24 01:08:55 localhost kernel: iommu: Default domain type: Translated
Nov 24 01:08:55 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 24 01:08:55 localhost kernel: SCSI subsystem initialized
Nov 24 01:08:55 localhost kernel: ACPI: bus type USB registered
Nov 24 01:08:55 localhost kernel: usbcore: registered new interface driver usbfs
Nov 24 01:08:55 localhost kernel: usbcore: registered new interface driver hub
Nov 24 01:08:55 localhost kernel: usbcore: registered new device driver usb
Nov 24 01:08:55 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 24 01:08:55 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 24 01:08:55 localhost kernel: PTP clock support registered
Nov 24 01:08:55 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 24 01:08:55 localhost kernel: NetLabel: Initializing
Nov 24 01:08:55 localhost kernel: NetLabel:  domain hash size = 128
Nov 24 01:08:55 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 24 01:08:55 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 24 01:08:55 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 24 01:08:55 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 24 01:08:55 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 24 01:08:55 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 24 01:08:55 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 24 01:08:55 localhost kernel: vgaarb: loaded
Nov 24 01:08:55 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 24 01:08:55 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 24 01:08:55 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 24 01:08:55 localhost kernel: pnp: PnP ACPI init
Nov 24 01:08:55 localhost kernel: pnp 00:03: [dma 2]
Nov 24 01:08:55 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 24 01:08:55 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 24 01:08:55 localhost kernel: NET: Registered PF_INET protocol family
Nov 24 01:08:55 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 24 01:08:55 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 24 01:08:55 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 24 01:08:55 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 24 01:08:55 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 24 01:08:55 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 24 01:08:55 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 24 01:08:55 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 01:08:55 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 01:08:55 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 24 01:08:55 localhost kernel: NET: Registered PF_XDP protocol family
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 24 01:08:55 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 24 01:08:55 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 24 01:08:55 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 24 01:08:55 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 112819 usecs
Nov 24 01:08:55 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 24 01:08:55 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 24 01:08:55 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 24 01:08:55 localhost kernel: ACPI: bus type thunderbolt registered
Nov 24 01:08:55 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 24 01:08:55 localhost kernel: Initialise system trusted keyrings
Nov 24 01:08:55 localhost kernel: Key type blacklist registered
Nov 24 01:08:55 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 24 01:08:55 localhost kernel: zbud: loaded
Nov 24 01:08:55 localhost kernel: integrity: Platform Keyring initialized
Nov 24 01:08:55 localhost kernel: integrity: Machine keyring initialized
Nov 24 01:08:55 localhost kernel: Freeing initrd memory: 85868K
Nov 24 01:08:55 localhost kernel: NET: Registered PF_ALG protocol family
Nov 24 01:08:55 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 24 01:08:55 localhost kernel: Key type asymmetric registered
Nov 24 01:08:55 localhost kernel: Asymmetric key parser 'x509' registered
Nov 24 01:08:55 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 24 01:08:55 localhost kernel: io scheduler mq-deadline registered
Nov 24 01:08:55 localhost kernel: io scheduler kyber registered
Nov 24 01:08:55 localhost kernel: io scheduler bfq registered
Nov 24 01:08:55 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 24 01:08:55 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 24 01:08:55 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 24 01:08:55 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 24 01:08:55 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 24 01:08:55 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 24 01:08:55 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 24 01:08:55 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 24 01:08:55 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 24 01:08:55 localhost kernel: Non-volatile memory driver v1.3
Nov 24 01:08:55 localhost kernel: rdac: device handler registered
Nov 24 01:08:55 localhost kernel: hp_sw: device handler registered
Nov 24 01:08:55 localhost kernel: emc: device handler registered
Nov 24 01:08:55 localhost kernel: alua: device handler registered
Nov 24 01:08:55 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 24 01:08:55 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 24 01:08:55 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 24 01:08:55 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 24 01:08:55 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 24 01:08:55 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 24 01:08:55 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 24 01:08:55 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 24 01:08:55 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 24 01:08:55 localhost kernel: hub 1-0:1.0: USB hub found
Nov 24 01:08:55 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 24 01:08:55 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 24 01:08:55 localhost kernel: usbserial: USB Serial support registered for generic
Nov 24 01:08:55 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 24 01:08:55 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 24 01:08:55 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 24 01:08:55 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 24 01:08:55 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 24 01:08:55 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 24 01:08:55 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-24T01:08:54 UTC (1763946534)
Nov 24 01:08:55 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 24 01:08:55 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 24 01:08:55 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 24 01:08:55 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 24 01:08:55 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 24 01:08:55 localhost kernel: usbcore: registered new interface driver usbhid
Nov 24 01:08:55 localhost kernel: usbhid: USB HID core driver
Nov 24 01:08:55 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 24 01:08:55 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 24 01:08:55 localhost kernel: Initializing XFRM netlink socket
Nov 24 01:08:55 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 24 01:08:55 localhost kernel: Segment Routing with IPv6
Nov 24 01:08:55 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 24 01:08:55 localhost kernel: mpls_gso: MPLS GSO support
Nov 24 01:08:55 localhost kernel: IPI shorthand broadcast: enabled
Nov 24 01:08:55 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 24 01:08:55 localhost kernel: AES CTR mode by8 optimization enabled
Nov 24 01:08:55 localhost kernel: sched_clock: Marking stable (1256002810, 143484130)->(1524907980, -125421040)
Nov 24 01:08:55 localhost kernel: registered taskstats version 1
Nov 24 01:08:55 localhost kernel: Loading compiled-in X.509 certificates
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 24 01:08:55 localhost kernel: Demotion targets for Node 0: null
Nov 24 01:08:55 localhost kernel: page_owner is disabled
Nov 24 01:08:55 localhost kernel: Key type .fscrypt registered
Nov 24 01:08:55 localhost kernel: Key type fscrypt-provisioning registered
Nov 24 01:08:55 localhost kernel: Key type big_key registered
Nov 24 01:08:55 localhost kernel: Key type encrypted registered
Nov 24 01:08:55 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 24 01:08:55 localhost kernel: Loading compiled-in module X.509 certificates
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 01:08:55 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 24 01:08:55 localhost kernel: ima: No architecture policies found
Nov 24 01:08:55 localhost kernel: evm: Initialising EVM extended attributes:
Nov 24 01:08:55 localhost kernel: evm: security.selinux
Nov 24 01:08:55 localhost kernel: evm: security.SMACK64 (disabled)
Nov 24 01:08:55 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 24 01:08:55 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 24 01:08:55 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 24 01:08:55 localhost kernel: evm: security.apparmor (disabled)
Nov 24 01:08:55 localhost kernel: evm: security.ima
Nov 24 01:08:55 localhost kernel: evm: security.capability
Nov 24 01:08:55 localhost kernel: evm: HMAC attrs: 0x1
Nov 24 01:08:55 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 24 01:08:55 localhost kernel: Running certificate verification RSA selftest
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 24 01:08:55 localhost kernel: Running certificate verification ECDSA selftest
Nov 24 01:08:55 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 24 01:08:55 localhost kernel: clk: Disabling unused clocks
Nov 24 01:08:55 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 24 01:08:55 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 24 01:08:55 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 24 01:08:55 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 24 01:08:55 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 24 01:08:55 localhost kernel: Run /init as init process
Nov 24 01:08:55 localhost kernel:   with arguments:
Nov 24 01:08:55 localhost kernel:     /init
Nov 24 01:08:55 localhost kernel:   with environment:
Nov 24 01:08:55 localhost kernel:     HOME=/
Nov 24 01:08:55 localhost kernel:     TERM=linux
Nov 24 01:08:55 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64
Nov 24 01:08:55 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 24 01:08:55 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 24 01:08:55 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 24 01:08:55 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 24 01:08:55 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 24 01:08:55 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 01:08:55 localhost systemd[1]: Detected virtualization kvm.
Nov 24 01:08:55 localhost systemd[1]: Detected architecture x86-64.
Nov 24 01:08:55 localhost systemd[1]: Running in initrd.
Nov 24 01:08:55 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 24 01:08:55 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 24 01:08:55 localhost systemd[1]: No hostname configured, using default hostname.
Nov 24 01:08:55 localhost systemd[1]: Hostname set to <localhost>.
Nov 24 01:08:55 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 24 01:08:55 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 24 01:08:55 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 01:08:55 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 01:08:55 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 24 01:08:55 localhost systemd[1]: Reached target Local File Systems.
Nov 24 01:08:55 localhost systemd[1]: Reached target Path Units.
Nov 24 01:08:55 localhost systemd[1]: Reached target Slice Units.
Nov 24 01:08:55 localhost systemd[1]: Reached target Swaps.
Nov 24 01:08:55 localhost systemd[1]: Reached target Timer Units.
Nov 24 01:08:55 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 01:08:55 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 24 01:08:55 localhost systemd[1]: Listening on Journal Socket.
Nov 24 01:08:55 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 01:08:55 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 01:08:55 localhost systemd[1]: Reached target Socket Units.
Nov 24 01:08:55 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 01:08:55 localhost systemd[1]: Starting Journal Service...
Nov 24 01:08:55 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 01:08:55 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 01:08:55 localhost systemd[1]: Starting Create System Users...
Nov 24 01:08:55 localhost systemd[1]: Starting Setup Virtual Console...
Nov 24 01:08:55 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 01:08:55 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 01:08:55 localhost systemd[1]: Finished Create System Users.
Nov 24 01:08:55 localhost systemd-journald[306]: Journal started
Nov 24 01:08:55 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/a7051cccfa00488d9432c0e2d4ac9648) is 8.0M, max 153.6M, 145.6M free.
Nov 24 01:08:55 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 24 01:08:55 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 24 01:08:55 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 24 01:08:55 localhost systemd[1]: Started Journal Service.
Nov 24 01:08:55 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 01:08:55 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 01:08:55 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 01:08:55 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 01:08:55 localhost systemd[1]: Finished Setup Virtual Console.
Nov 24 01:08:55 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 24 01:08:55 localhost systemd[1]: Starting dracut cmdline hook...
Nov 24 01:08:55 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Nov 24 01:08:55 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 01:08:55 localhost systemd[1]: Finished dracut cmdline hook.
Nov 24 01:08:55 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 24 01:08:55 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 24 01:08:55 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 24 01:08:55 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 24 01:08:55 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 24 01:08:55 localhost kernel: RPC: Registered udp transport module.
Nov 24 01:08:55 localhost kernel: RPC: Registered tcp transport module.
Nov 24 01:08:55 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 24 01:08:55 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 24 01:08:55 localhost rpc.statd[443]: Version 2.5.4 starting
Nov 24 01:08:55 localhost rpc.statd[443]: Initializing NSM state
Nov 24 01:08:55 localhost rpc.idmapd[448]: Setting log level to 0
Nov 24 01:08:55 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 24 01:08:55 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 01:08:55 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 01:08:55 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 01:08:55 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 24 01:08:55 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 24 01:08:55 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 01:08:56 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 24 01:08:56 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 01:08:56 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 01:08:56 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 01:08:56 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 01:08:56 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 24 01:08:56 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 01:08:56 localhost systemd[1]: Reached target Network.
Nov 24 01:08:56 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 01:08:56 localhost systemd[1]: Starting dracut initqueue hook...
Nov 24 01:08:56 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 24 01:08:56 localhost systemd[1]: Reached target System Initialization.
Nov 24 01:08:56 localhost systemd[1]: Reached target Basic System.
Nov 24 01:08:56 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 24 01:08:56 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 24 01:08:56 localhost kernel:  vda: vda1
Nov 24 01:08:56 localhost kernel: libata version 3.00 loaded.
Nov 24 01:08:56 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 24 01:08:56 localhost kernel: scsi host0: ata_piix
Nov 24 01:08:56 localhost kernel: scsi host1: ata_piix
Nov 24 01:08:56 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 24 01:08:56 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 24 01:08:56 localhost systemd-udevd[496]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:08:56 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 01:08:56 localhost systemd[1]: Reached target Initrd Root Device.
Nov 24 01:08:56 localhost kernel: ata1: found unknown device (class 0)
Nov 24 01:08:56 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 24 01:08:56 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 24 01:08:56 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 24 01:08:56 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 24 01:08:56 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 24 01:08:56 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 24 01:08:56 localhost systemd[1]: Finished dracut initqueue hook.
Nov 24 01:08:56 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 01:08:56 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 24 01:08:56 localhost systemd[1]: Reached target Remote File Systems.
Nov 24 01:08:56 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 24 01:08:56 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 24 01:08:56 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 24 01:08:56 localhost systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Nov 24 01:08:56 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 01:08:56 localhost systemd[1]: Mounting /sysroot...
Nov 24 01:08:57 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 24 01:08:57 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 24 01:08:57 localhost kernel: XFS (vda1): Ending clean mount
Nov 24 01:08:57 localhost systemd[1]: Mounted /sysroot.
Nov 24 01:08:57 localhost systemd[1]: Reached target Initrd Root File System.
Nov 24 01:08:57 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 24 01:08:57 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 24 01:08:57 localhost systemd[1]: Reached target Initrd File Systems.
Nov 24 01:08:57 localhost systemd[1]: Reached target Initrd Default Target.
Nov 24 01:08:57 localhost systemd[1]: Starting dracut mount hook...
Nov 24 01:08:57 localhost systemd[1]: Finished dracut mount hook.
Nov 24 01:08:57 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 24 01:08:57 localhost rpc.idmapd[448]: exiting on signal 15
Nov 24 01:08:57 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 24 01:08:57 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 24 01:08:57 localhost systemd[1]: Stopped target Network.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Timer Units.
Nov 24 01:08:57 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 24 01:08:57 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Basic System.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Path Units.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Remote File Systems.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Slice Units.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Socket Units.
Nov 24 01:08:57 localhost systemd[1]: Stopped target System Initialization.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Local File Systems.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Swaps.
Nov 24 01:08:57 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut mount hook.
Nov 24 01:08:57 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 24 01:08:57 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 24 01:08:57 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 24 01:08:57 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 24 01:08:57 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 24 01:08:57 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 24 01:08:57 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 24 01:08:57 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 24 01:08:57 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 24 01:08:57 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 24 01:08:57 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 24 01:08:57 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Closed udev Control Socket.
Nov 24 01:08:57 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Closed udev Kernel Socket.
Nov 24 01:08:57 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 24 01:08:57 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 24 01:08:57 localhost systemd[1]: Starting Cleanup udev Database...
Nov 24 01:08:57 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 24 01:08:57 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 24 01:08:57 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Stopped Create System Users.
Nov 24 01:08:57 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 24 01:08:57 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 24 01:08:57 localhost systemd[1]: Finished Cleanup udev Database.
Nov 24 01:08:57 localhost systemd[1]: Reached target Switch Root.
Nov 24 01:08:57 localhost systemd[1]: Starting Switch Root...
Nov 24 01:08:57 localhost systemd[1]: Switching root.
Nov 24 01:08:57 localhost systemd-journald[306]: Journal stopped
Nov 24 01:08:58 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Nov 24 01:08:58 localhost kernel: audit: type=1404 audit(1763946537.824:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability open_perms=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:08:58 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:08:58 localhost kernel: audit: type=1403 audit(1763946538.013:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 24 01:08:58 localhost systemd[1]: Successfully loaded SELinux policy in 192.289ms.
Nov 24 01:08:58 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.843ms.
Nov 24 01:08:58 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 01:08:58 localhost systemd[1]: Detected virtualization kvm.
Nov 24 01:08:58 localhost systemd[1]: Detected architecture x86-64.
Nov 24 01:08:58 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:08:58 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Stopped Switch Root.
Nov 24 01:08:58 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 24 01:08:58 localhost systemd[1]: Created slice Slice /system/getty.
Nov 24 01:08:58 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 24 01:08:58 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 24 01:08:58 localhost systemd[1]: Created slice User and Session Slice.
Nov 24 01:08:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 01:08:58 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 24 01:08:58 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 24 01:08:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 01:08:58 localhost systemd[1]: Stopped target Switch Root.
Nov 24 01:08:58 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 24 01:08:58 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 24 01:08:58 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 24 01:08:58 localhost systemd[1]: Reached target Path Units.
Nov 24 01:08:58 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 24 01:08:58 localhost systemd[1]: Reached target Slice Units.
Nov 24 01:08:58 localhost systemd[1]: Reached target Swaps.
Nov 24 01:08:58 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 24 01:08:58 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 24 01:08:58 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 24 01:08:58 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 24 01:08:58 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 24 01:08:58 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 01:08:58 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 01:08:58 localhost systemd[1]: Mounting Huge Pages File System...
Nov 24 01:08:58 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 24 01:08:58 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 24 01:08:58 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 24 01:08:58 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 01:08:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 01:08:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 01:08:58 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 24 01:08:58 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 24 01:08:58 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 24 01:08:58 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 24 01:08:58 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 24 01:08:58 localhost systemd[1]: Stopped Journal Service.
Nov 24 01:08:58 localhost systemd[1]: Starting Journal Service...
Nov 24 01:08:58 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 01:08:58 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 24 01:08:58 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 01:08:58 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 24 01:08:58 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 24 01:08:58 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 01:08:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 01:08:58 localhost kernel: fuse: init (API version 7.37)
Nov 24 01:08:58 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 24 01:08:58 localhost systemd[1]: Mounted Huge Pages File System.
Nov 24 01:08:58 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 24 01:08:58 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 24 01:08:58 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 24 01:08:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 01:08:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 01:08:58 localhost systemd-journald[678]: Journal started
Nov 24 01:08:58 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 01:08:58 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 24 01:08:58 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Started Journal Service.
Nov 24 01:08:58 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 24 01:08:58 localhost kernel: ACPI: bus type drm_connector registered
Nov 24 01:08:58 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 24 01:08:58 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 24 01:08:58 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 24 01:08:58 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 24 01:08:58 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 24 01:08:58 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 24 01:08:58 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 01:08:58 localhost systemd[1]: Mounting FUSE Control File System...
Nov 24 01:08:58 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 01:08:58 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 24 01:08:58 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 24 01:08:58 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 24 01:08:58 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 24 01:08:58 localhost systemd[1]: Starting Create System Users...
Nov 24 01:08:58 localhost systemd[1]: Mounted FUSE Control File System.
Nov 24 01:08:58 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 01:08:58 localhost systemd-journald[678]: Received client request to flush runtime journal.
Nov 24 01:08:58 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 24 01:08:58 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 24 01:08:58 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 01:08:58 localhost systemd[1]: Finished Create System Users.
Nov 24 01:08:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 01:08:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 01:08:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 01:08:58 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 24 01:08:58 localhost systemd[1]: Reached target Local File Systems.
Nov 24 01:08:58 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 24 01:08:58 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 24 01:08:58 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 24 01:08:58 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 24 01:08:58 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 24 01:08:58 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 24 01:08:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 01:08:58 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Nov 24 01:08:58 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 24 01:08:59 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 01:08:59 localhost systemd[1]: Starting Security Auditing Service...
Nov 24 01:08:59 localhost systemd[1]: Starting RPC Bind...
Nov 24 01:08:59 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 24 01:08:59 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 24 01:08:59 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 24 01:08:59 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 24 01:08:59 localhost systemd[1]: Started RPC Bind.
Nov 24 01:08:59 localhost augenrules[708]: /sbin/augenrules: No change
Nov 24 01:08:59 localhost augenrules[723]: No rules
Nov 24 01:08:59 localhost augenrules[723]: enabled 1
Nov 24 01:08:59 localhost augenrules[723]: failure 1
Nov 24 01:08:59 localhost augenrules[723]: pid 703
Nov 24 01:08:59 localhost augenrules[723]: rate_limit 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_limit 8192
Nov 24 01:08:59 localhost augenrules[723]: lost 0
Nov 24 01:08:59 localhost augenrules[723]: backlog 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time 60000
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time_actual 0
Nov 24 01:08:59 localhost augenrules[723]: enabled 1
Nov 24 01:08:59 localhost augenrules[723]: failure 1
Nov 24 01:08:59 localhost augenrules[723]: pid 703
Nov 24 01:08:59 localhost augenrules[723]: rate_limit 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_limit 8192
Nov 24 01:08:59 localhost augenrules[723]: lost 0
Nov 24 01:08:59 localhost augenrules[723]: backlog 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time 60000
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time_actual 0
Nov 24 01:08:59 localhost augenrules[723]: enabled 1
Nov 24 01:08:59 localhost augenrules[723]: failure 1
Nov 24 01:08:59 localhost augenrules[723]: pid 703
Nov 24 01:08:59 localhost augenrules[723]: rate_limit 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_limit 8192
Nov 24 01:08:59 localhost augenrules[723]: lost 0
Nov 24 01:08:59 localhost augenrules[723]: backlog 0
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time 60000
Nov 24 01:08:59 localhost augenrules[723]: backlog_wait_time_actual 0
Nov 24 01:08:59 localhost systemd[1]: Started Security Auditing Service.
Nov 24 01:08:59 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 24 01:08:59 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 24 01:08:59 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 24 01:08:59 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 01:08:59 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 24 01:08:59 localhost systemd[1]: Starting Update is Completed...
Nov 24 01:08:59 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 01:08:59 localhost systemd[1]: Finished Update is Completed.
Nov 24 01:08:59 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 01:08:59 localhost systemd[1]: Reached target System Initialization.
Nov 24 01:08:59 localhost systemd[1]: Started dnf makecache --timer.
Nov 24 01:08:59 localhost systemd[1]: Started Daily rotation of log files.
Nov 24 01:08:59 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 24 01:08:59 localhost systemd[1]: Reached target Timer Units.
Nov 24 01:08:59 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 01:08:59 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 24 01:08:59 localhost systemd[1]: Reached target Socket Units.
Nov 24 01:08:59 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 24 01:08:59 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 01:08:59 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 24 01:08:59 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 01:08:59 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 01:08:59 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 01:08:59 localhost systemd-udevd[748]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:08:59 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 24 01:08:59 localhost systemd[1]: Reached target Basic System.
Nov 24 01:08:59 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 24 01:08:59 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 24 01:08:59 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 24 01:08:59 localhost dbus-broker-lau[770]: Ready
Nov 24 01:08:59 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 24 01:08:59 localhost systemd[1]: Starting NTP client/server...
Nov 24 01:08:59 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 24 01:08:59 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 24 01:08:59 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 24 01:08:59 localhost systemd[1]: Started irqbalance daemon.
Nov 24 01:08:59 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 24 01:08:59 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:08:59 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:08:59 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:08:59 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 24 01:08:59 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 24 01:08:59 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 24 01:08:59 localhost systemd[1]: Starting User Login Management...
Nov 24 01:08:59 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 24 01:08:59 localhost chronyd[794]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 01:08:59 localhost chronyd[794]: Loaded 0 symmetric keys
Nov 24 01:08:59 localhost chronyd[794]: Using right/UTC timezone to obtain leap second data
Nov 24 01:08:59 localhost chronyd[794]: Loaded seccomp filter (level 2)
Nov 24 01:08:59 localhost systemd[1]: Started NTP client/server.
Nov 24 01:08:59 localhost systemd-logind[791]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 01:08:59 localhost systemd-logind[791]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 01:08:59 localhost systemd-logind[791]: New seat seat0.
Nov 24 01:08:59 localhost systemd[1]: Started User Login Management.
Nov 24 01:08:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 24 01:08:59 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 24 01:08:59 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 24 01:08:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 24 01:08:59 localhost kernel: kvm_amd: TSC scaling supported
Nov 24 01:08:59 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 24 01:08:59 localhost kernel: kvm_amd: Nested Paging enabled
Nov 24 01:08:59 localhost kernel: kvm_amd: LBR virtualization supported
Nov 24 01:08:59 localhost kernel: Console: switching to colour dummy device 80x25
Nov 24 01:08:59 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 24 01:08:59 localhost kernel: [drm] features: -context_init
Nov 24 01:08:59 localhost kernel: [drm] number of scanouts: 1
Nov 24 01:08:59 localhost kernel: [drm] number of cap sets: 0
Nov 24 01:08:59 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 24 01:08:59 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 24 01:08:59 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 24 01:08:59 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 24 01:08:59 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Nov 24 01:08:59 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 24 01:09:00 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 24 Nov 2025 01:09:00 +0000. Up 6.98 seconds.
Nov 24 01:09:00 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 24 01:09:00 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 24 01:09:00 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpi4rf076v.mount: Deactivated successfully.
Nov 24 01:09:00 localhost systemd[1]: Starting Hostname Service...
Nov 24 01:09:00 localhost systemd[1]: Started Hostname Service.
Nov 24 01:09:00 np0005532889.novalocal systemd-hostnamed[854]: Hostname set to <np0005532889.novalocal> (static)
Nov 24 01:09:00 np0005532889.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 24 01:09:00 np0005532889.novalocal systemd[1]: Reached target Preparation for Network.
Nov 24 01:09:00 np0005532889.novalocal systemd[1]: Starting Network Manager...
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9318] NetworkManager (version 1.54.1-1.el9) is starting... (boot:c54d865c-1bf1-4cad-ad82-0976a3ee1591)
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9324] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9540] manager[0x56513b6ff080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9608] hostname: hostname: using hostnamed
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9608] hostname: static hostname changed from (none) to "np0005532889.novalocal"
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9615] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9787] manager[0x56513b6ff080]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9788] manager[0x56513b6ff080]: rfkill: WWAN hardware radio set enabled
Nov 24 01:09:00 np0005532889.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9945] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9946] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9947] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9947] manager: Networking is enabled by state file
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9951] settings: Loaded settings plugin: keyfile (internal)
Nov 24 01:09:00 np0005532889.novalocal NetworkManager[858]: <info>  [1763946540.9989] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0024] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0058] dhcp: init: Using DHCP client 'internal'
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0061] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0076] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0090] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0098] device (lo): Activation: starting connection 'lo' (da7d2480-bf68-42ff-860c-6b8f4466c871)
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0109] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0112] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0141] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0145] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0148] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0150] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0152] device (eth0): carrier: link connected
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0155] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0162] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0167] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0174] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0175] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0178] manager: NetworkManager state is now CONNECTING
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0180] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0187] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0191] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0248] dhcp4 (eth0): state changed new lease, address=38.102.83.32
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0295] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Started Network Manager.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Reached target Network.
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0387] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0571] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0576] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0584] device (lo): Activation: successful, device activated.
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0592] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0594] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0598] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0602] device (eth0): Activation: successful, device activated.
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0610] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 01:09:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946541.0613] manager: startup complete
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Reached target NFS client services.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Reached target Remote File Systems.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 01:09:01 np0005532889.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 24 Nov 2025 01:09:01 +0000. Up 8.10 seconds.
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |  eth0  | True |         38.102.83.32         | 255.255.255.0 | global | fa:16:3e:b9:68:c1 |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:feb9:68c1/64 |       .       |  link  | fa:16:3e:b9:68:c1 |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 24 01:09:01 np0005532889.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: new group: name=cloud-user, GID=1001
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: add 'cloud-user' to group 'adm'
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: add 'cloud-user' to group 'systemd-journal'
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: add 'cloud-user' to shadow group 'adm'
Nov 24 01:09:02 np0005532889.novalocal useradd[987]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Generating public/private rsa key pair.
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: SHA256:ZiMlVSynfIWIBPJuynfVjAOEPdfu0x5cFK62JS4wdVk root@np0005532889.novalocal
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +---[RSA 3072]----+
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |  . .=o..=..  E. |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |   o..+.+ = .+.  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |    . o+.=..o..  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |   .   +o=o. ..  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |    o . Sooo+..  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: | . o   = =oo++   |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |  o . .   .oo.   |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |   . .     ..    |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |                 |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Generating public/private ecdsa key pair.
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: SHA256:3awynt+uH3u5ZfiFzd/w+uvMklakCexnYz16bWOThTk root@np0005532889.novalocal
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +---[ECDSA 256]---+
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |                 |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |                 |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |          .      |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |         . =   . |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        S o + =o |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |           o BEO.|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        o . = Oo@|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |       . + . B=%*|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        o.o+*.BXX|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Generating public/private ed25519 key pair.
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: SHA256:v0OKNbGTm7+2/6z+n3iNr+ukO8d2BatwSRpFTI0cQDI root@np0005532889.novalocal
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +--[ED25519 256]--+
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        E.o*++   |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |         o  = .  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |           .     |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        . . . .  |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        S+ + . o |
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |        *.+ o . .|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |       o B.o o..o|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |      . + o.o+*.+|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: |         o=**@OBo|
Nov 24 01:09:02 np0005532889.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Reached target Network is Online.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting System Logging Service...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting Permit User Sessions...
Nov 24 01:09:02 np0005532889.novalocal sm-notify[1003]: Version 2.5.4 starting
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 24 01:09:02 np0005532889.novalocal sshd[1005]: Server listening on 0.0.0.0 port 22.
Nov 24 01:09:02 np0005532889.novalocal sshd[1005]: Server listening on :: port 22.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Finished Permit User Sessions.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started Command Scheduler.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started Getty on tty1.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 24 01:09:02 np0005532889.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Nov 24 01:09:02 np0005532889.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Reached target Login Prompts.
Nov 24 01:09:02 np0005532889.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 83% if used.)
Nov 24 01:09:02 np0005532889.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Nov 24 01:09:02 np0005532889.novalocal rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Nov 24 01:09:02 np0005532889.novalocal rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Started System Logging Service.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Reached target Multi-User System.
Nov 24 01:09:02 np0005532889.novalocal sshd-session[1017]: Unable to negotiate with 38.102.83.114 port 46876: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 24 01:09:02 np0005532889.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 24 01:09:02 np0005532889.novalocal rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:09:02 np0005532889.novalocal sshd-session[1035]: Unable to negotiate with 38.102.83.114 port 46896: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1048]: Unable to negotiate with 38.102.83.114 port 46898: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1008]: Connection closed by 38.102.83.114 port 46866 [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1075]: Unable to negotiate with 38.102.83.114 port 46918: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1027]: Connection closed by 38.102.83.114 port 46884 [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1078]: Unable to negotiate with 38.102.83.114 port 46926: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 24 01:09:03 np0005532889.novalocal kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Nov 24 01:09:03 np0005532889.novalocal kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1057]: Connection closed by 38.102.83.114 port 46914 [preauth]
Nov 24 01:09:03 np0005532889.novalocal sshd-session[1068]: Connection closed by 38.102.83.114 port 46916 [preauth]
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1149]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 24 Nov 2025 01:09:03 +0000. Up 9.90 seconds.
Nov 24 01:09:03 np0005532889.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 24 01:09:03 np0005532889.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 24 01:09:03 np0005532889.novalocal dracut[1283]: dracut-057-102.git20250818.el9
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1300]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 24 Nov 2025 01:09:03 +0000. Up 10.32 seconds.
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1302]: #############################################################
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1303]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1305]: 256 SHA256:3awynt+uH3u5ZfiFzd/w+uvMklakCexnYz16bWOThTk root@np0005532889.novalocal (ECDSA)
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1307]: 256 SHA256:v0OKNbGTm7+2/6z+n3iNr+ukO8d2BatwSRpFTI0cQDI root@np0005532889.novalocal (ED25519)
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1309]: 3072 SHA256:ZiMlVSynfIWIBPJuynfVjAOEPdfu0x5cFK62JS4wdVk root@np0005532889.novalocal (RSA)
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1311]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1313]: #############################################################
Nov 24 01:09:03 np0005532889.novalocal cloud-init[1300]: Cloud-init v. 24.4-7.el9 finished at Mon, 24 Nov 2025 01:09:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.53 seconds
Nov 24 01:09:03 np0005532889.novalocal dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 24 01:09:03 np0005532889.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 24 01:09:03 np0005532889.novalocal systemd[1]: Reached target Cloud-init target.
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 01:09:04 np0005532889.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: memstrack is not available
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: memstrack is not available
Nov 24 01:09:05 np0005532889.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 01:09:05 np0005532889.novalocal chronyd[794]: Selected source 142.4.192.253 (2.centos.pool.ntp.org)
Nov 24 01:09:05 np0005532889.novalocal chronyd[794]: System clock TAI offset set to 37 seconds
Nov 24 01:09:06 np0005532889.novalocal dracut[1286]: *** Including module: systemd ***
Nov 24 01:09:06 np0005532889.novalocal dracut[1286]: *** Including module: fips ***
Nov 24 01:09:06 np0005532889.novalocal dracut[1286]: *** Including module: systemd-initrd ***
Nov 24 01:09:06 np0005532889.novalocal dracut[1286]: *** Including module: i18n ***
Nov 24 01:09:06 np0005532889.novalocal dracut[1286]: *** Including module: drm ***
Nov 24 01:09:07 np0005532889.novalocal dracut[1286]: *** Including module: prefixdevname ***
Nov 24 01:09:07 np0005532889.novalocal dracut[1286]: *** Including module: kernel-modules ***
Nov 24 01:09:07 np0005532889.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 24 01:09:08 np0005532889.novalocal chronyd[794]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: kernel-modules-extra ***
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: qemu ***
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: fstab-sys ***
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: rootfs-block ***
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: terminfo ***
Nov 24 01:09:08 np0005532889.novalocal dracut[1286]: *** Including module: udev-rules ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: Skipping udev rule: 91-permissions.rules
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: virtiofs ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: dracut-systemd ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: usrmount ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: base ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: fs-lib ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: kdumpbase ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]:   microcode_ctl module: mangling fw_dir
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 24 01:09:09 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 25 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 31 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 28 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 32 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 30 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 24 01:09:10 np0005532889.novalocal irqbalance[784]: IRQ 29 affinity is now unmanaged
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]: *** Including module: openssl ***
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]: *** Including module: shutdown ***
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]: *** Including module: squash ***
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]: *** Including modules done ***
Nov 24 01:09:10 np0005532889.novalocal dracut[1286]: *** Installing kernel module dependencies ***
Nov 24 01:09:11 np0005532889.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:09:11 np0005532889.novalocal dracut[1286]: *** Installing kernel module dependencies done ***
Nov 24 01:09:11 np0005532889.novalocal dracut[1286]: *** Resolving executable dependencies ***
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: *** Resolving executable dependencies done ***
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: *** Generating early-microcode cpio image ***
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: *** Store current command line parameters ***
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: Stored kernel commandline:
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Nov 24 01:09:13 np0005532889.novalocal dracut[1286]: *** Install squash loader ***
Nov 24 01:09:14 np0005532889.novalocal dracut[1286]: *** Squashing the files inside the initramfs ***
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: *** Squashing the files inside the initramfs done ***
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: *** Hardlinking files ***
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Mode:           real
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Files:          50
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Linked:         0 files
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Compared:       0 xattrs
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Compared:       0 files
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Saved:          0 B
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: Duration:       0.000427 seconds
Nov 24 01:09:15 np0005532889.novalocal dracut[1286]: *** Hardlinking files done ***
Nov 24 01:09:16 np0005532889.novalocal dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 24 01:09:16 np0005532889.novalocal kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Nov 24 01:09:16 np0005532889.novalocal kdumpctl[1019]: kdump: Starting kdump: [OK]
Nov 24 01:09:16 np0005532889.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 24 01:09:16 np0005532889.novalocal systemd[1]: Startup finished in 1.599s (kernel) + 2.929s (initrd) + 19.020s (userspace) = 23.549s.
Nov 24 01:09:17 np0005532889.novalocal sshd-session[4294]: Accepted publickey for zuul from 38.102.83.114 port 52422 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 24 01:09:17 np0005532889.novalocal systemd-logind[791]: New session 1 of user zuul.
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Queued start job for default target Main User Target.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Created slice User Application Slice.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Reached target Paths.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Reached target Timers.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Starting D-Bus User Message Bus Socket...
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Starting Create User's Volatile Files and Directories...
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Listening on D-Bus User Message Bus Socket.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Reached target Sockets.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Finished Create User's Volatile Files and Directories.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Reached target Basic System.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Reached target Main User Target.
Nov 24 01:09:17 np0005532889.novalocal systemd[4298]: Startup finished in 164ms.
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 24 01:09:17 np0005532889.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 24 01:09:17 np0005532889.novalocal sshd-session[4294]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:09:18 np0005532889.novalocal python3[4380]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:09:20 np0005532889.novalocal python3[4408]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:09:26 np0005532889.novalocal python3[4466]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:09:28 np0005532889.novalocal python3[4506]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 24 01:09:30 np0005532889.novalocal python3[4532]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD1vkjvjk6iw4mP9XVYC1PqW/Mrbrb/u8RkVV/nVUhM1Wp0xIOn5rxZNMdOq1jvAqEjNawpI7G22OH9kuZ5IQkUMw0OY3aQ+/P8IH1BC4cgtuG5hEbxAhadirS00S6rKO6jCEq0qk3oN7wnYYfP+mhwXFQ1N6XUwbjX4zvMEvFzn/fKNeJoiATPI6ac+T7XXe4YLpaLf9aNwzSA9EV2J5C960VRo8NSo/p9GjVLyUhE3aXHU2Le8KEXkucPlkxFrk3MCTcqJtGzrNBh6U7VjawFVsGXjtG72FxvHSbopX719jxoWmuu4ms5VJtDBTYmOL02UX9659/x8KhFCT9Z9yc1v2wagAJFI+vsyMisU5yp6kyE9UQfOHFHZeJHVjvdhHvGVlIFasJVgxAqo6WGyxQ4++t+gicw79dtJ+GEUdbDBiNRwlaslPH8IeQ0/L0plieArpSDvKKhepzqk4qXoioENj04JiduIH1ygPgwMfrRufwba1u+YN4J8PiVBy+xq3c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:30 np0005532889.novalocal python3[4556]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:30 np0005532889.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 01:09:31 np0005532889.novalocal python3[4656]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:31 np0005532889.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763946570.6686714-207-214723492912263/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3ef1e7f530e04dc2af06e09d66590a1e_id_rsa follow=False checksum=9c0d62b0369bc1d5c58b625049909fe38dd3a696 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:32 np0005532889.novalocal sshd-session[4557]: Invalid user support from 185.156.73.233 port 21538
Nov 24 01:09:32 np0005532889.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:32 np0005532889.novalocal sshd-session[4557]: Connection closed by invalid user support 185.156.73.233 port 21538 [preauth]
Nov 24 01:09:32 np0005532889.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763946571.6994374-240-77838062801202/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3ef1e7f530e04dc2af06e09d66590a1e_id_rsa.pub follow=False checksum=6bb315e4476444ce6e7b457b12681aa6d4d46e70 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:33 np0005532889.novalocal python3[4972]: ansible-ping Invoked with data=pong
Nov 24 01:09:34 np0005532889.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:09:36 np0005532889.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 24 01:09:37 np0005532889.novalocal python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:37 np0005532889.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:38 np0005532889.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:38 np0005532889.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:38 np0005532889.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:39 np0005532889.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:40 np0005532889.novalocal sudo[5230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-karkybsmwjlawayzgsidtlvsbuuealwz ; /usr/bin/python3'
Nov 24 01:09:40 np0005532889.novalocal sudo[5230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:40 np0005532889.novalocal python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:40 np0005532889.novalocal sudo[5230]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:41 np0005532889.novalocal sudo[5308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwlglwbzavutlqzszxratartdhgpxnhv ; /usr/bin/python3'
Nov 24 01:09:41 np0005532889.novalocal sudo[5308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:41 np0005532889.novalocal python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:41 np0005532889.novalocal sudo[5308]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:41 np0005532889.novalocal sudo[5381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-howdplnoxsyznjcoyxjlsrxhjvjbyusz ; /usr/bin/python3'
Nov 24 01:09:41 np0005532889.novalocal sudo[5381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:41 np0005532889.novalocal python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763946581.0007164-21-132845054326685/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:41 np0005532889.novalocal sudo[5381]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:42 np0005532889.novalocal python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:42 np0005532889.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:43 np0005532889.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:43 np0005532889.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:43 np0005532889.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:44 np0005532889.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:44 np0005532889.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:44 np0005532889.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:45 np0005532889.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:45 np0005532889.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:45 np0005532889.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:45 np0005532889.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:46 np0005532889.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:46 np0005532889.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:46 np0005532889.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:47 np0005532889.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:47 np0005532889.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:47 np0005532889.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:47 np0005532889.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:48 np0005532889.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:48 np0005532889.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:48 np0005532889.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:49 np0005532889.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:49 np0005532889.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:49 np0005532889.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:50 np0005532889.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:09:52 np0005532889.novalocal sudo[6055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqdywlzbtjmfcrhcxqzslfksakmljjr ; /usr/bin/python3'
Nov 24 01:09:52 np0005532889.novalocal sudo[6055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:52 np0005532889.novalocal python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 01:09:52 np0005532889.novalocal systemd[1]: Starting Time & Date Service...
Nov 24 01:09:52 np0005532889.novalocal systemd[1]: Started Time & Date Service.
Nov 24 01:09:52 np0005532889.novalocal systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Nov 24 01:09:52 np0005532889.novalocal sudo[6055]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:53 np0005532889.novalocal sudo[6086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifcqvxooqeqrlawaamxopptkvemydqsg ; /usr/bin/python3'
Nov 24 01:09:53 np0005532889.novalocal sudo[6086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:53 np0005532889.novalocal python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:53 np0005532889.novalocal sudo[6086]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:53 np0005532889.novalocal python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:54 np0005532889.novalocal python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763946593.4678247-153-15986285494007/source _original_basename=tmptz14k920 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:54 np0005532889.novalocal python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:55 np0005532889.novalocal python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763946594.4712348-183-12486005778688/source _original_basename=tmpxzyeyfdl follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:55 np0005532889.novalocal sudo[6506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqxvuszbjyoifixakciayetavkqyuzc ; /usr/bin/python3'
Nov 24 01:09:55 np0005532889.novalocal sudo[6506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:55 np0005532889.novalocal python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:55 np0005532889.novalocal sudo[6506]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:56 np0005532889.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvdevbkpbgwzgtproqnmawxaxrbvcikr ; /usr/bin/python3'
Nov 24 01:09:56 np0005532889.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:56 np0005532889.novalocal python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763946595.519587-231-92410226863144/source _original_basename=tmp_9uqtth6 follow=False checksum=240d352bdcd1801ce67a7c834b4d8739ebffa02c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:56 np0005532889.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:56 np0005532889.novalocal python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:09:57 np0005532889.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:09:57 np0005532889.novalocal sudo[6733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvkayjyhuslhhvwyswitqezznofyscmh ; /usr/bin/python3'
Nov 24 01:09:57 np0005532889.novalocal sudo[6733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:57 np0005532889.novalocal python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:09:57 np0005532889.novalocal sudo[6733]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:57 np0005532889.novalocal sudo[6806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhorhhpxfscmxtnpqzpokotddyzpjins ; /usr/bin/python3'
Nov 24 01:09:57 np0005532889.novalocal sudo[6806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:58 np0005532889.novalocal python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763946597.271073-273-264007623212849/source _original_basename=tmpvqle2wpc follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:09:58 np0005532889.novalocal sudo[6806]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:58 np0005532889.novalocal sudo[6857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vabkgujcffwyrunojusbpfyqjumpoyug ; /usr/bin/python3'
Nov 24 01:09:58 np0005532889.novalocal sudo[6857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:09:58 np0005532889.novalocal python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-425b-fa00-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:09:58 np0005532889.novalocal sudo[6857]: pam_unix(sudo:session): session closed for user root
Nov 24 01:09:59 np0005532889.novalocal python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-425b-fa00-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 24 01:10:00 np0005532889.novalocal python3[6916]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:10:17 np0005532889.novalocal sudo[6940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lafmnmfuwnhxvfpwppremiicbailgait ; /usr/bin/python3'
Nov 24 01:10:17 np0005532889.novalocal sudo[6940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:10:17 np0005532889.novalocal python3[6942]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:10:17 np0005532889.novalocal sudo[6940]: pam_unix(sudo:session): session closed for user root
Nov 24 01:10:22 np0005532889.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 24 01:10:52 np0005532889.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 24 01:10:52 np0005532889.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7366] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 01:10:52 np0005532889.novalocal systemd-udevd[6946]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7653] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7696] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7705] device (eth1): carrier: link connected
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7707] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7717] policy: auto-activating connection 'Wired connection 1' (798898cd-a7a8-3202-830a-832499ef243d)
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7725] device (eth1): Activation: starting connection 'Wired connection 1' (798898cd-a7a8-3202-830a-832499ef243d)
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7726] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7731] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7737] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:10:52 np0005532889.novalocal NetworkManager[858]: <info>  [1763946652.7743] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:10:53 np0005532889.novalocal python3[6972]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-2c63-7dbe-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:11:00 np0005532889.novalocal sudo[7050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kainthqopqushlapbpejfoujtapgesjq ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 01:11:00 np0005532889.novalocal sudo[7050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:11:00 np0005532889.novalocal python3[7052]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:11:00 np0005532889.novalocal sudo[7050]: pam_unix(sudo:session): session closed for user root
Nov 24 01:11:00 np0005532889.novalocal sudo[7123]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsmfwqmqgqtumyhbivogsgtdpcjgskle ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 01:11:00 np0005532889.novalocal sudo[7123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:11:01 np0005532889.novalocal python3[7125]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763946660.2939446-102-190946920137829/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=00683718b5d8ada77c5c8e69419bd1418772ecd1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:11:01 np0005532889.novalocal sudo[7123]: pam_unix(sudo:session): session closed for user root
Nov 24 01:11:01 np0005532889.novalocal sudo[7173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzaqtfkdeavsmaieriuhmijqkbpgclcu ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 01:11:01 np0005532889.novalocal sudo[7173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:11:01 np0005532889.novalocal python3[7175]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Stopping Network Manager...
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.8931] caught SIGTERM, shutting down normally.
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.8952] dhcp4 (eth0): canceled DHCP transaction
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.8953] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.8953] dhcp4 (eth0): state changed no lease
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.8958] manager: NetworkManager state is now CONNECTING
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.9079] dhcp4 (eth1): canceled DHCP transaction
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.9079] dhcp4 (eth1): state changed no lease
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[858]: <info>  [1763946661.9133] exiting (success)
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Stopped Network Manager.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: NetworkManager.service: Consumed 1.001s CPU time, 10.0M memory peak.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Starting Network Manager...
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946661.9758] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c54d865c-1bf1-4cad-ad82-0976a3ee1591)
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946661.9761] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 01:11:01 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946661.9807] manager[0x563048b6a070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 01:11:01 np0005532889.novalocal systemd[1]: Starting Hostname Service...
Nov 24 01:11:02 np0005532889.novalocal systemd[1]: Started Hostname Service.
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0886] hostname: hostname: using hostnamed
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0889] hostname: static hostname changed from (none) to "np0005532889.novalocal"
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0895] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0901] manager[0x563048b6a070]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0902] manager[0x563048b6a070]: rfkill: WWAN hardware radio set enabled
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0945] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0945] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0946] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0946] manager: Networking is enabled by state file
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0950] settings: Loaded settings plugin: keyfile (internal)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.0957] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1004] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1021] dhcp: init: Using DHCP client 'internal'
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1025] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1033] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1042] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1053] device (lo): Activation: starting connection 'lo' (da7d2480-bf68-42ff-860c-6b8f4466c871)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1063] device (eth0): carrier: link connected
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1071] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1082] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1083] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1094] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1104] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1113] device (eth1): carrier: link connected
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1120] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1127] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (798898cd-a7a8-3202-830a-832499ef243d) (indicated)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1127] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1135] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1146] device (eth1): Activation: starting connection 'Wired connection 1' (798898cd-a7a8-3202-830a-832499ef243d)
Nov 24 01:11:02 np0005532889.novalocal systemd[1]: Started Network Manager.
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1156] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1166] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1170] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1173] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1176] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1192] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1198] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1205] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1212] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1225] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1231] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1253] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1257] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1292] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1301] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1313] device (lo): Activation: successful, device activated.
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1326] dhcp4 (eth0): state changed new lease, address=38.102.83.32
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1338] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 01:11:02 np0005532889.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1440] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1510] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1513] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1522] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1529] device (eth0): Activation: successful, device activated.
Nov 24 01:11:02 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946662.1541] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 01:11:02 np0005532889.novalocal sudo[7173]: pam_unix(sudo:session): session closed for user root
Nov 24 01:11:02 np0005532889.novalocal python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-2c63-7dbe-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:11:03 np0005532889.novalocal sshd-session[7262]: Connection closed by 156.67.62.181 port 55500
Nov 24 01:11:04 np0005532889.novalocal sshd-session[7263]: Invalid user a from 156.67.62.181 port 55514
Nov 24 01:11:04 np0005532889.novalocal sshd-session[7263]: Connection closed by invalid user a 156.67.62.181 port 55514 [preauth]
Nov 24 01:11:12 np0005532889.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:11:24 np0005532889.novalocal systemd[4298]: Starting Mark boot as successful...
Nov 24 01:11:24 np0005532889.novalocal systemd[4298]: Finished Mark boot as successful.
Nov 24 01:11:32 np0005532889.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.2842] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 01:11:47 np0005532889.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:11:47 np0005532889.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3151] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3153] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3160] device (eth1): Activation: successful, device activated.
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3166] manager: startup complete
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3168] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <warn>  [1763946707.3172] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3179] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3279] dhcp4 (eth1): canceled DHCP transaction
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3279] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3279] dhcp4 (eth1): state changed no lease
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3296] policy: auto-activating connection 'ci-private-network' (e2301243-2a8f-5270-a46e-0de358c9532a)
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3301] device (eth1): Activation: starting connection 'ci-private-network' (e2301243-2a8f-5270-a46e-0de358c9532a)
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3302] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3306] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3313] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3323] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3365] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3367] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:11:47 np0005532889.novalocal NetworkManager[7185]: <info>  [1763946707.3375] device (eth1): Activation: successful, device activated.
Nov 24 01:11:57 np0005532889.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:12:01 np0005532889.novalocal sudo[7366]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlyjjnbjitpnlczbqourvlorzgsnmjfq ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 01:12:01 np0005532889.novalocal sudo[7366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:12:01 np0005532889.novalocal python3[7368]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:12:01 np0005532889.novalocal sudo[7366]: pam_unix(sudo:session): session closed for user root
Nov 24 01:12:02 np0005532889.novalocal sudo[7439]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blnmpdcwiyotmcedxtzhlukkojtmgcwy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 01:12:02 np0005532889.novalocal sudo[7439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:12:02 np0005532889.novalocal python3[7441]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763946721.5173888-259-32805871237746/source _original_basename=tmpl6zusi59 follow=False checksum=5f8984173faccbca0705cea768b31ffe9bd25568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:12:02 np0005532889.novalocal sudo[7439]: pam_unix(sudo:session): session closed for user root
Nov 24 01:13:02 np0005532889.novalocal sshd-session[4307]: Received disconnect from 38.102.83.114 port 52422:11: disconnected by user
Nov 24 01:13:02 np0005532889.novalocal sshd-session[4307]: Disconnected from user zuul 38.102.83.114 port 52422
Nov 24 01:13:02 np0005532889.novalocal sshd-session[4294]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:13:02 np0005532889.novalocal systemd-logind[791]: Session 1 logged out. Waiting for processes to exit.
Nov 24 01:14:24 np0005532889.novalocal systemd[4298]: Created slice User Background Tasks Slice.
Nov 24 01:14:24 np0005532889.novalocal systemd[4298]: Starting Cleanup of User's Temporary Files and Directories...
Nov 24 01:14:24 np0005532889.novalocal systemd[4298]: Finished Cleanup of User's Temporary Files and Directories.
Nov 24 01:16:54 np0005532889.novalocal sshd-session[7470]: Accepted publickey for zuul from 38.102.83.114 port 60742 ssh2: RSA SHA256:bpZNyMYYoO203pco3j6B+eO5t/MKKepnVkUBttgUZEY
Nov 24 01:16:54 np0005532889.novalocal systemd-logind[791]: New session 3 of user zuul.
Nov 24 01:16:54 np0005532889.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 24 01:16:54 np0005532889.novalocal sshd-session[7470]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:16:54 np0005532889.novalocal sudo[7497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urnlahdnalwijfioplvlxpxkeukuwokh ; /usr/bin/python3'
Nov 24 01:16:54 np0005532889.novalocal sudo[7497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:54 np0005532889.novalocal python3[7499]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-93e4-a08d-000000001cc4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:16:54 np0005532889.novalocal sudo[7497]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:54 np0005532889.novalocal sudo[7526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlprvrqahzvejxleldyxhkrmvoidqmev ; /usr/bin/python3'
Nov 24 01:16:54 np0005532889.novalocal sudo[7526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:54 np0005532889.novalocal python3[7528]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:54 np0005532889.novalocal sudo[7526]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:55 np0005532889.novalocal sudo[7552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxrwlbfxtxlufxvmuynpokimrrdyrfmm ; /usr/bin/python3'
Nov 24 01:16:55 np0005532889.novalocal sudo[7552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:55 np0005532889.novalocal python3[7554]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:55 np0005532889.novalocal sudo[7552]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:55 np0005532889.novalocal sudo[7578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqxfywzqosmqseodgkaljptvypnxzizy ; /usr/bin/python3'
Nov 24 01:16:55 np0005532889.novalocal sudo[7578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:55 np0005532889.novalocal python3[7580]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:55 np0005532889.novalocal sudo[7578]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:55 np0005532889.novalocal sudo[7604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumzpvfwlobyowdwkwxkdhpewjdrdulk ; /usr/bin/python3'
Nov 24 01:16:55 np0005532889.novalocal sudo[7604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:55 np0005532889.novalocal python3[7606]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:55 np0005532889.novalocal sudo[7604]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:56 np0005532889.novalocal sudo[7630]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdmnlpsaqwtibbaqekexgavnnalygipf ; /usr/bin/python3'
Nov 24 01:16:56 np0005532889.novalocal sudo[7630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:56 np0005532889.novalocal python3[7632]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:56 np0005532889.novalocal sudo[7630]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:56 np0005532889.novalocal sudo[7708]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mncajootmuibmevsjlhecsioxhiznjnn ; /usr/bin/python3'
Nov 24 01:16:56 np0005532889.novalocal sudo[7708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:56 np0005532889.novalocal python3[7710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:16:56 np0005532889.novalocal sudo[7708]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:57 np0005532889.novalocal sudo[7781]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgsthnklvbydcbqjknwohnjskfyfauap ; /usr/bin/python3'
Nov 24 01:16:57 np0005532889.novalocal sudo[7781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:57 np0005532889.novalocal python3[7783]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763947016.51296-468-53074390538376/source _original_basename=tmpvj20sssn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:16:57 np0005532889.novalocal sudo[7781]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:57 np0005532889.novalocal sudo[7831]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpeopkfrpcmwecbcvmertfgvrmhlrxbl ; /usr/bin/python3'
Nov 24 01:16:57 np0005532889.novalocal sudo[7831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:58 np0005532889.novalocal python3[7833]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:16:58 np0005532889.novalocal systemd[1]: Reloading.
Nov 24 01:16:58 np0005532889.novalocal systemd-rc-local-generator[7855]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:16:58 np0005532889.novalocal sudo[7831]: pam_unix(sudo:session): session closed for user root
Nov 24 01:16:59 np0005532889.novalocal sudo[7888]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfihkidxofmmznwumcsxlyxfdaxlvmd ; /usr/bin/python3'
Nov 24 01:16:59 np0005532889.novalocal sudo[7888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:16:59 np0005532889.novalocal python3[7890]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 24 01:16:59 np0005532889.novalocal sudo[7888]: pam_unix(sudo:session): session closed for user root
Nov 24 01:17:00 np0005532889.novalocal sudo[7914]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akjzxrlqgjwelqkymshsgxsvgaxzagxr ; /usr/bin/python3'
Nov 24 01:17:00 np0005532889.novalocal sudo[7914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:17:00 np0005532889.novalocal python3[7916]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:17:00 np0005532889.novalocal sudo[7914]: pam_unix(sudo:session): session closed for user root
Nov 24 01:17:00 np0005532889.novalocal sudo[7942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxujmtdxtpvyknmxzyvwulsmskahquz ; /usr/bin/python3'
Nov 24 01:17:00 np0005532889.novalocal sudo[7942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:17:00 np0005532889.novalocal python3[7944]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:17:00 np0005532889.novalocal sudo[7942]: pam_unix(sudo:session): session closed for user root
Nov 24 01:17:00 np0005532889.novalocal sudo[7970]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxokoovgihtxqxpvzvvyeoprujdxribt ; /usr/bin/python3'
Nov 24 01:17:00 np0005532889.novalocal sudo[7970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:17:00 np0005532889.novalocal python3[7972]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:17:00 np0005532889.novalocal sudo[7970]: pam_unix(sudo:session): session closed for user root
Nov 24 01:17:00 np0005532889.novalocal sudo[7998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqguvhngwftlcxyymokifumbohyzoelr ; /usr/bin/python3'
Nov 24 01:17:00 np0005532889.novalocal sudo[7998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:17:00 np0005532889.novalocal python3[8000]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:17:00 np0005532889.novalocal sudo[7998]: pam_unix(sudo:session): session closed for user root
Nov 24 01:17:01 np0005532889.novalocal python3[8027]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ec2-ffbe-93e4-a08d-000000001ccb-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:17:02 np0005532889.novalocal python3[8057]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 24 01:17:03 np0005532889.novalocal sshd-session[7473]: Connection closed by 38.102.83.114 port 60742
Nov 24 01:17:03 np0005532889.novalocal sshd-session[7470]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:17:03 np0005532889.novalocal systemd-logind[791]: Session 3 logged out. Waiting for processes to exit.
Nov 24 01:17:03 np0005532889.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 24 01:17:03 np0005532889.novalocal systemd[1]: session-3.scope: Consumed 3.970s CPU time.
Nov 24 01:17:03 np0005532889.novalocal systemd-logind[791]: Removed session 3.
Nov 24 01:17:05 np0005532889.novalocal sshd-session[8063]: Accepted publickey for zuul from 38.102.83.114 port 51804 ssh2: RSA SHA256:bpZNyMYYoO203pco3j6B+eO5t/MKKepnVkUBttgUZEY
Nov 24 01:17:05 np0005532889.novalocal systemd-logind[791]: New session 4 of user zuul.
Nov 24 01:17:05 np0005532889.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 24 01:17:05 np0005532889.novalocal sshd-session[8063]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:17:05 np0005532889.novalocal sudo[8090]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onuduypxsrhcjyiijffwaohjuinpityc ; /usr/bin/python3'
Nov 24 01:17:05 np0005532889.novalocal sudo[8090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:17:05 np0005532889.novalocal python3[8092]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 24 01:17:10 np0005532889.novalocal irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 24 01:17:10 np0005532889.novalocal irqbalance[784]: IRQ 27 affinity is now unmanaged
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:17:21 np0005532889.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:17:31 np0005532889.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:17:40 np0005532889.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:17:41 np0005532889.novalocal setsebool[8155]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 24 01:17:41 np0005532889.novalocal setsebool[8155]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:17:52 np0005532889.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:18:12 np0005532889.novalocal dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 01:18:12 np0005532889.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:18:12 np0005532889.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:18:12 np0005532889.novalocal systemd[1]: Reloading.
Nov 24 01:18:12 np0005532889.novalocal systemd-rc-local-generator[8910]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:18:12 np0005532889.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:18:14 np0005532889.novalocal sudo[8090]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:22 np0005532889.novalocal python3[14512]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ec2-ffbe-e535-c612-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:18:22 np0005532889.novalocal kernel: evm: overlay not supported
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: Starting D-Bus User Message Bus...
Nov 24 01:18:23 np0005532889.novalocal dbus-broker-launch[14918]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 24 01:18:23 np0005532889.novalocal dbus-broker-launch[14918]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: Started D-Bus User Message Bus.
Nov 24 01:18:23 np0005532889.novalocal dbus-broker-lau[14918]: Ready
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: Created slice Slice /user.
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: podman-14852.scope: unit configures an IP firewall, but not running as root.
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: (This warning is only shown for the first unit using IP firewalling.)
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: Started podman-14852.scope.
Nov 24 01:18:23 np0005532889.novalocal systemd[4298]: Started podman-pause-39f43832.scope.
Nov 24 01:18:24 np0005532889.novalocal sudo[15435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjerzvelvcifggxoztrqjcvmpvenrsmw ; /usr/bin/python3'
Nov 24 01:18:24 np0005532889.novalocal sudo[15435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:24 np0005532889.novalocal python3[15450]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.66:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.66:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:18:24 np0005532889.novalocal python3[15450]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 24 01:18:24 np0005532889.novalocal sudo[15435]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:24 np0005532889.novalocal sshd-session[8066]: Connection closed by 38.102.83.114 port 51804
Nov 24 01:18:24 np0005532889.novalocal sshd-session[8063]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:18:24 np0005532889.novalocal systemd-logind[791]: Session 4 logged out. Waiting for processes to exit.
Nov 24 01:18:24 np0005532889.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 24 01:18:24 np0005532889.novalocal systemd[1]: session-4.scope: Consumed 1min 547ms CPU time.
Nov 24 01:18:24 np0005532889.novalocal systemd-logind[791]: Removed session 4.
Nov 24 01:18:42 np0005532889.novalocal sshd-session[22596]: Connection closed by 38.102.83.111 port 52586 [preauth]
Nov 24 01:18:42 np0005532889.novalocal sshd-session[22597]: Connection closed by 38.102.83.111 port 52590 [preauth]
Nov 24 01:18:42 np0005532889.novalocal sshd-session[22593]: Unable to negotiate with 38.102.83.111 port 52602: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 24 01:18:42 np0005532889.novalocal sshd-session[22599]: Unable to negotiate with 38.102.83.111 port 52616: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 24 01:18:42 np0005532889.novalocal sshd-session[22601]: Unable to negotiate with 38.102.83.111 port 52626: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 24 01:18:46 np0005532889.novalocal sshd-session[24141]: Accepted publickey for zuul from 38.102.83.114 port 56300 ssh2: RSA SHA256:bpZNyMYYoO203pco3j6B+eO5t/MKKepnVkUBttgUZEY
Nov 24 01:18:46 np0005532889.novalocal systemd-logind[791]: New session 5 of user zuul.
Nov 24 01:18:46 np0005532889.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 24 01:18:46 np0005532889.novalocal sshd-session[24141]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:18:47 np0005532889.novalocal python3[24254]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNBalVDAine8Pj7a7KfjZJcyQSRL0wpg5jl37cD97FzS/iHPTlCfLiHMmhzr2UuYokJK39RJ2coHTRFtNYzXqE4= zuul@np0005532888.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:18:47 np0005532889.novalocal sudo[24412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdryntosmjarwlfdnneglkatewecyuii ; /usr/bin/python3'
Nov 24 01:18:47 np0005532889.novalocal sudo[24412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:47 np0005532889.novalocal python3[24425]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNBalVDAine8Pj7a7KfjZJcyQSRL0wpg5jl37cD97FzS/iHPTlCfLiHMmhzr2UuYokJK39RJ2coHTRFtNYzXqE4= zuul@np0005532888.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:18:47 np0005532889.novalocal sudo[24412]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:48 np0005532889.novalocal sudo[24702]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbwjlzkuwenuhxdhcbkcclnqvztphuwi ; /usr/bin/python3'
Nov 24 01:18:48 np0005532889.novalocal sudo[24702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:48 np0005532889.novalocal python3[24712]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532889.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 24 01:18:48 np0005532889.novalocal useradd[24783]: new group: name=cloud-admin, GID=1002
Nov 24 01:18:48 np0005532889.novalocal useradd[24783]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 24 01:18:48 np0005532889.novalocal sudo[24702]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:48 np0005532889.novalocal sudo[24937]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospirvjfkhamhqxbzgcizgmgmqcugvgi ; /usr/bin/python3'
Nov 24 01:18:48 np0005532889.novalocal sudo[24937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:49 np0005532889.novalocal python3[24944]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNBalVDAine8Pj7a7KfjZJcyQSRL0wpg5jl37cD97FzS/iHPTlCfLiHMmhzr2UuYokJK39RJ2coHTRFtNYzXqE4= zuul@np0005532888.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 01:18:49 np0005532889.novalocal sudo[24937]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:49 np0005532889.novalocal sudo[25183]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skzphwxaoxehbzsrpqravaavphbqlmnt ; /usr/bin/python3'
Nov 24 01:18:49 np0005532889.novalocal sudo[25183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:49 np0005532889.novalocal python3[25193]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:18:49 np0005532889.novalocal sudo[25183]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:49 np0005532889.novalocal sudo[25426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukysbcldjkcndvjawombbqjiqigbqabu ; /usr/bin/python3'
Nov 24 01:18:49 np0005532889.novalocal sudo[25426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:49 np0005532889.novalocal python3[25439]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763947129.227415-135-138937160140591/source _original_basename=tmpkwmrgsh4 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:18:49 np0005532889.novalocal sudo[25426]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:50 np0005532889.novalocal sudo[25739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aznpnksufvykbrjdxgloekcenlxicaap ; /usr/bin/python3'
Nov 24 01:18:50 np0005532889.novalocal sudo[25739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:18:50 np0005532889.novalocal python3[25751]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 24 01:18:50 np0005532889.novalocal systemd[1]: Starting Hostname Service...
Nov 24 01:18:50 np0005532889.novalocal systemd[1]: Started Hostname Service.
Nov 24 01:18:50 np0005532889.novalocal systemd-hostnamed[25871]: Changed pretty hostname to 'compute-0'
Nov 24 01:18:50 compute-0 systemd-hostnamed[25871]: Hostname set to <compute-0> (static)
Nov 24 01:18:50 compute-0 NetworkManager[7185]: <info>  [1763947130.8262] hostname: static hostname changed from "np0005532889.novalocal" to "compute-0"
Nov 24 01:18:50 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:18:50 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:18:50 compute-0 sudo[25739]: pam_unix(sudo:session): session closed for user root
Nov 24 01:18:51 compute-0 sshd-session[24190]: Connection closed by 38.102.83.114 port 56300
Nov 24 01:18:51 compute-0 sshd-session[24141]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:18:51 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Nov 24 01:18:51 compute-0 systemd[1]: session-5.scope: Consumed 2.300s CPU time.
Nov 24 01:18:51 compute-0 systemd-logind[791]: Session 5 logged out. Waiting for processes to exit.
Nov 24 01:18:51 compute-0 systemd-logind[791]: Removed session 5.
Nov 24 01:19:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:19:02 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:19:02 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:19:02 compute-0 systemd[1]: man-db-cache-update.service: Consumed 55.267s CPU time.
Nov 24 01:19:02 compute-0 systemd[1]: run-r2b089ac40ae14d6184879a21410a2275.service: Deactivated successfully.
Nov 24 01:19:20 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 01:20:55 compute-0 sshd-session[29913]: Received disconnect from 193.46.255.244 port 14524:11:  [preauth]
Nov 24 01:20:55 compute-0 sshd-session[29913]: Disconnected from authenticating user root 193.46.255.244 port 14524 [preauth]
Nov 24 01:21:06 compute-0 sshd-session[29915]: Connection closed by authenticating user root 185.156.73.233 port 43344 [preauth]
Nov 24 01:22:29 compute-0 sshd-session[29918]: Accepted publickey for zuul from 38.102.83.111 port 47848 ssh2: RSA SHA256:bpZNyMYYoO203pco3j6B+eO5t/MKKepnVkUBttgUZEY
Nov 24 01:22:29 compute-0 systemd-logind[791]: New session 6 of user zuul.
Nov 24 01:22:29 compute-0 systemd[1]: Started Session 6 of User zuul.
Nov 24 01:22:29 compute-0 sshd-session[29918]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:22:29 compute-0 python3[29994]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:22:32 compute-0 sudo[30108]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrxbgqoobbpddvsiwzpnenkithypdkus ; /usr/bin/python3'
Nov 24 01:22:32 compute-0 sudo[30108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:32 compute-0 python3[30110]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:32 compute-0 sudo[30108]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:32 compute-0 sudo[30181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksgnappjmwsjodkmfpkkfudfqlaycoht ; /usr/bin/python3'
Nov 24 01:22:32 compute-0 sudo[30181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:32 compute-0 python3[30183]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:32 compute-0 sudo[30181]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:32 compute-0 sudo[30207]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktfmpooyiroxmbkgpqhuaqoeyhdmealu ; /usr/bin/python3'
Nov 24 01:22:32 compute-0 sudo[30207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:33 compute-0 python3[30209]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:33 compute-0 sudo[30207]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:33 compute-0 sudo[30280]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egznxakyunofjyhbjbddlucltyeadaud ; /usr/bin/python3'
Nov 24 01:22:33 compute-0 sudo[30280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:33 compute-0 python3[30282]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:33 compute-0 sudo[30280]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:33 compute-0 sudo[30306]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmzhqfmnzailnpofmfzxomwctwgbnppf ; /usr/bin/python3'
Nov 24 01:22:33 compute-0 sudo[30306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:33 compute-0 python3[30308]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:33 compute-0 sudo[30306]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:33 compute-0 sudo[30379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyyyweesacnosmrymqttukvusgvdrqca ; /usr/bin/python3'
Nov 24 01:22:33 compute-0 sudo[30379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:33 compute-0 python3[30381]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:33 compute-0 sudo[30379]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:34 compute-0 sudo[30405]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghdsvcblvtxvqcejltbgrcghulspfsaq ; /usr/bin/python3'
Nov 24 01:22:34 compute-0 sudo[30405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:34 compute-0 python3[30407]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:34 compute-0 sudo[30405]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:34 compute-0 sudo[30478]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zseeumyxbknfeklejsjmqysztxvlxphr ; /usr/bin/python3'
Nov 24 01:22:34 compute-0 sudo[30478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:34 compute-0 python3[30480]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:34 compute-0 sudo[30478]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:34 compute-0 sudo[30505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndcdwkbgqesgsapbcbsiiojhrfgmaypj ; /usr/bin/python3'
Nov 24 01:22:34 compute-0 sudo[30505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:34 compute-0 python3[30507]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:34 compute-0 sudo[30505]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:34 compute-0 sudo[30578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxwxvanimziwgsehvshrjmnoauayciiw ; /usr/bin/python3'
Nov 24 01:22:34 compute-0 sudo[30578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:35 compute-0 python3[30580]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:35 compute-0 sudo[30578]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:35 compute-0 sudo[30604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqfvbrveyuyskpnqflkymjznkaebvpoc ; /usr/bin/python3'
Nov 24 01:22:35 compute-0 sudo[30604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:35 compute-0 python3[30606]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:35 compute-0 sudo[30604]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:35 compute-0 sudo[30677]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whvetxfxafhdlmpngigxqkleeqydyapo ; /usr/bin/python3'
Nov 24 01:22:35 compute-0 sudo[30677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:35 compute-0 python3[30679]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:35 compute-0 sudo[30677]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:35 compute-0 sudo[30703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zegpjbvjifxozmuwggcgafaabafotjsx ; /usr/bin/python3'
Nov 24 01:22:35 compute-0 sudo[30703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:36 compute-0 python3[30705]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 01:22:36 compute-0 sudo[30703]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:36 compute-0 sudo[30776]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssquasgljahvozihdtxbtpywdepikxx ; /usr/bin/python3'
Nov 24 01:22:36 compute-0 sudo[30776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:22:36 compute-0 python3[30778]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763947351.9594674-33563-123045176181801/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:22:36 compute-0 sudo[30776]: pam_unix(sudo:session): session closed for user root
Nov 24 01:22:38 compute-0 sshd-session[30805]: Unable to negotiate with 192.168.122.11 port 35918: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 24 01:22:38 compute-0 sshd-session[30803]: Connection closed by 192.168.122.11 port 35906 [preauth]
Nov 24 01:22:38 compute-0 sshd-session[30804]: Connection closed by 192.168.122.11 port 35908 [preauth]
Nov 24 01:22:38 compute-0 sshd-session[30806]: Unable to negotiate with 192.168.122.11 port 35924: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 24 01:22:38 compute-0 sshd-session[30807]: Unable to negotiate with 192.168.122.11 port 35940: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 24 01:22:47 compute-0 python3[30836]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:24:24 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 24 01:24:24 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 24 01:24:24 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 24 01:24:24 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 24 01:27:47 compute-0 sshd-session[29921]: Received disconnect from 38.102.83.111 port 47848:11: disconnected by user
Nov 24 01:27:47 compute-0 sshd-session[29921]: Disconnected from user zuul 38.102.83.111 port 47848
Nov 24 01:27:47 compute-0 sshd-session[29918]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:27:47 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 24 01:27:47 compute-0 systemd[1]: session-6.scope: Consumed 5.257s CPU time.
Nov 24 01:27:47 compute-0 systemd-logind[791]: Session 6 logged out. Waiting for processes to exit.
Nov 24 01:27:47 compute-0 systemd-logind[791]: Removed session 6.
Nov 24 01:32:31 compute-0 sshd-session[30847]: Connection closed by authenticating user root 185.156.73.233 port 49878 [preauth]
Nov 24 01:32:39 compute-0 sshd-session[30849]: Received disconnect from 46.188.119.26 port 55924:11: Bye Bye [preauth]
Nov 24 01:32:39 compute-0 sshd-session[30849]: Disconnected from authenticating user root 46.188.119.26 port 55924 [preauth]
Nov 24 01:33:27 compute-0 sshd-session[30851]: Accepted publickey for zuul from 192.168.122.30 port 38466 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:33:27 compute-0 systemd-logind[791]: New session 7 of user zuul.
Nov 24 01:33:27 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 24 01:33:27 compute-0 sshd-session[30851]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:33:29 compute-0 python3.9[31004]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:33:30 compute-0 sudo[31183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszmxiaahnjoyhumzlkuufcaacrkpvmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948009.647387-32-154363257095874/AnsiballZ_command.py'
Nov 24 01:33:30 compute-0 sudo[31183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:33:30 compute-0 python3.9[31185]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:33:37 compute-0 sudo[31183]: pam_unix(sudo:session): session closed for user root
Nov 24 01:33:38 compute-0 sshd-session[30854]: Connection closed by 192.168.122.30 port 38466
Nov 24 01:33:38 compute-0 sshd-session[30851]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:33:38 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 24 01:33:38 compute-0 systemd[1]: session-7.scope: Consumed 8.236s CPU time.
Nov 24 01:33:38 compute-0 systemd-logind[791]: Session 7 logged out. Waiting for processes to exit.
Nov 24 01:33:38 compute-0 systemd-logind[791]: Removed session 7.
Nov 24 01:33:43 compute-0 sshd-session[31244]: Accepted publickey for zuul from 192.168.122.30 port 50414 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:33:43 compute-0 systemd-logind[791]: New session 8 of user zuul.
Nov 24 01:33:43 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 24 01:33:43 compute-0 sshd-session[31244]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:33:44 compute-0 python3.9[31397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:33:44 compute-0 sshd-session[31247]: Connection closed by 192.168.122.30 port 50414
Nov 24 01:33:44 compute-0 sshd-session[31244]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:33:44 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 24 01:33:44 compute-0 systemd-logind[791]: Session 8 logged out. Waiting for processes to exit.
Nov 24 01:33:44 compute-0 systemd-logind[791]: Removed session 8.
Nov 24 01:34:00 compute-0 sshd-session[31424]: Accepted publickey for zuul from 192.168.122.30 port 47048 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:34:00 compute-0 systemd-logind[791]: New session 9 of user zuul.
Nov 24 01:34:00 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 24 01:34:00 compute-0 sshd-session[31424]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:34:01 compute-0 python3.9[31577]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 24 01:34:02 compute-0 python3.9[31751]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:34:03 compute-0 sudo[31902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswbephhdxbzqxjlybwfmeoofkagrozc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948042.7586439-45-170824207347246/AnsiballZ_command.py'
Nov 24 01:34:03 compute-0 sudo[31902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:03 compute-0 python3.9[31904]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:34:03 compute-0 sudo[31902]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:04 compute-0 sudo[32056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzsorttdruedjwukhhmlxeebgwinyhfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948043.6768177-57-275095420688242/AnsiballZ_stat.py'
Nov 24 01:34:04 compute-0 sudo[32056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:04 compute-0 python3.9[32058]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:34:04 compute-0 sudo[32056]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:04 compute-0 sudo[32210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqlluikzhzdnhsmytbfcygfofmlznki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948044.4801633-65-193906977091557/AnsiballZ_file.py'
Nov 24 01:34:04 compute-0 sudo[32210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:05 compute-0 python3.9[32212]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:34:05 compute-0 sudo[32210]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:05 compute-0 sshd-session[32135]: Invalid user ghost from 46.188.119.26 port 58090
Nov 24 01:34:05 compute-0 sshd-session[32135]: Received disconnect from 46.188.119.26 port 58090:11: Bye Bye [preauth]
Nov 24 01:34:05 compute-0 sshd-session[32135]: Disconnected from invalid user ghost 46.188.119.26 port 58090 [preauth]
Nov 24 01:34:05 compute-0 sudo[32362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jteormxjelxhpxcvnvrafsrqfplzovhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948045.3013756-73-117595846049311/AnsiballZ_stat.py'
Nov 24 01:34:05 compute-0 sudo[32362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:05 compute-0 python3.9[32364]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:34:05 compute-0 sudo[32362]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:06 compute-0 sudo[32486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxzogabzvcuithkcdojzorjohgwhrqql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948045.3013756-73-117595846049311/AnsiballZ_copy.py'
Nov 24 01:34:06 compute-0 sudo[32486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:06 compute-0 python3.9[32488]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948045.3013756-73-117595846049311/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:34:06 compute-0 sudo[32486]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:07 compute-0 sudo[32638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rldosogoomikqzppphffzscnzpusqpqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948046.7269015-88-249859349003252/AnsiballZ_setup.py'
Nov 24 01:34:07 compute-0 sudo[32638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:07 compute-0 python3.9[32640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:34:07 compute-0 sudo[32638]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:07 compute-0 sudo[32794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atbpjjndrtmqrmsjabxcchaguilhgnou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948047.6453857-96-40578041679311/AnsiballZ_file.py'
Nov 24 01:34:07 compute-0 sudo[32794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:08 compute-0 python3.9[32796]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:34:08 compute-0 sudo[32794]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:08 compute-0 sudo[32946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lipstqmwwaoogwvxyhnzpvnzzeocuygo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948048.237094-105-230224150676757/AnsiballZ_file.py'
Nov 24 01:34:08 compute-0 sudo[32946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:08 compute-0 python3.9[32948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:34:08 compute-0 sudo[32946]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:09 compute-0 python3.9[33098]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:34:13 compute-0 python3.9[33351]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:34:14 compute-0 python3.9[33501]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:34:15 compute-0 python3.9[33655]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:34:16 compute-0 sudo[33811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trkkdpqdtecvmfmjiwxyrykmmlsdxivv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948056.1897213-153-107717448322443/AnsiballZ_setup.py'
Nov 24 01:34:16 compute-0 sudo[33811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:16 compute-0 python3.9[33813]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:34:16 compute-0 sudo[33811]: pam_unix(sudo:session): session closed for user root
Nov 24 01:34:17 compute-0 sudo[33895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpxiodqoslqmpelhdhcsybvxsoglwsth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948056.1897213-153-107717448322443/AnsiballZ_dnf.py'
Nov 24 01:34:17 compute-0 sudo[33895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:34:17 compute-0 python3.9[33897]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:34:58 compute-0 systemd[1]: Reloading.
Nov 24 01:34:58 compute-0 systemd-rc-local-generator[34087]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:34:59 compute-0 systemd[1]: Starting dnf makecache...
Nov 24 01:34:59 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 24 01:34:59 compute-0 dnf[34097]: Failed determining last makecache time.
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-barbican-42b4c41831408a8e323 143 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 193 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-cinder-1c00d6490d88e436f26ef 194 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-stevedore-c4acc5639fd2329372142 191 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 systemd[1]: Reloading.
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-observabilityclient-2f31846d73c 165 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-os-net-config-bbae2ed8a159b0435a473f38 156 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 162 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-designate-tests-tempest-347fdbc 157 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 systemd-rc-local-generator[34132]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-glance-1fd12c29b339f30fe823e 203 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 176 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-manila-3c01b7181572c95dac462 193 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-whitebox-neutron-tests-tempest- 182 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-octavia-ba397f07a7331190208c 176 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-watcher-c014f81a8647287f6dcc 171 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-tcib-1124124ec06aadbac34f0d340b 168 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 157 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-swift-dc98a8463506ac520c469a 186 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-python-tempestconf-8515371b7cceebd4282 197 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 dnf[34097]: delorean-openstack-heat-ui-013accbfd179753bc3f0 200 kB/s | 3.0 kB     00:00
Nov 24 01:34:59 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 24 01:34:59 compute-0 systemd[1]: Reloading.
Nov 24 01:34:59 compute-0 systemd-rc-local-generator[34182]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:34:59 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 24 01:34:59 compute-0 dnf[34097]: CentOS Stream 9 - BaseOS                         28 kB/s | 7.3 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: CentOS Stream 9 - AppStream                      70 kB/s | 7.4 kB     00:00
Nov 24 01:35:00 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:35:00 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:35:00 compute-0 dnf[34097]: CentOS Stream 9 - CRB                            80 kB/s | 7.2 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: CentOS Stream 9 - Extras packages                79 kB/s | 8.3 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: dlrn-antelope-testing                            96 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: dlrn-antelope-build-deps                        142 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: centos9-rabbitmq                                116 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: centos9-storage                                  30 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: centos9-opstools                                 59 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: NFV SIG OpenvSwitch                             103 kB/s | 3.0 kB     00:00
Nov 24 01:35:00 compute-0 dnf[34097]: repo-setup-centos-appstream                      60 kB/s | 4.4 kB     00:00
Nov 24 01:35:01 compute-0 dnf[34097]: repo-setup-centos-baseos                        138 kB/s | 3.9 kB     00:00
Nov 24 01:35:01 compute-0 dnf[34097]: repo-setup-centos-highavailability              130 kB/s | 3.9 kB     00:00
Nov 24 01:35:01 compute-0 dnf[34097]: repo-setup-centos-powertools                    134 kB/s | 4.3 kB     00:00
Nov 24 01:35:01 compute-0 dnf[34097]: Extra Packages for Enterprise Linux 9 - x86_64  275 kB/s |  34 kB     00:00
Nov 24 01:35:01 compute-0 dnf[34097]: Metadata cache created.
Nov 24 01:35:01 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 24 01:35:01 compute-0 systemd[1]: Finished dnf makecache.
Nov 24 01:35:01 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.821s CPU time.
Nov 24 01:35:16 compute-0 sshd-session[34278]: Invalid user postgres from 46.188.119.26 port 58416
Nov 24 01:35:16 compute-0 sshd-session[34278]: Received disconnect from 46.188.119.26 port 58416:11: Bye Bye [preauth]
Nov 24 01:35:16 compute-0 sshd-session[34278]: Disconnected from invalid user postgres 46.188.119.26 port 58416 [preauth]
Nov 24 01:36:01 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:36:01 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:36:02 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 24 01:36:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:36:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:36:02 compute-0 systemd[1]: Reloading.
Nov 24 01:36:02 compute-0 systemd-rc-local-generator[34533]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:36:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:36:03 compute-0 sudo[33895]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:03 compute-0 sudo[35439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtnbgoesgmjrgxoftbhwfbfylvvvjdse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948163.2142975-165-51826699556037/AnsiballZ_command.py'
Nov 24 01:36:03 compute-0 sudo[35439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:36:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:36:03 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.237s CPU time.
Nov 24 01:36:03 compute-0 systemd[1]: run-r72adde2829e042fa919f93e68cdf2490.service: Deactivated successfully.
Nov 24 01:36:03 compute-0 python3.9[35441]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:05 compute-0 sudo[35439]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:06 compute-0 sudo[35721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmdyyrcegxjynffplspbwxqjsimaeit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948165.3505177-173-212233249992616/AnsiballZ_selinux.py'
Nov 24 01:36:06 compute-0 sudo[35721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:06 compute-0 sshd[1005]: Timeout before authentication for connection from 113.249.158.104 to 38.102.83.32, pid = 31799
Nov 24 01:36:06 compute-0 python3.9[35723]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 24 01:36:06 compute-0 sudo[35721]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:06 compute-0 sudo[35873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oicwgnnhhwdtapvmeynuvfbwgxvcifor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948166.6933248-184-236595249628312/AnsiballZ_command.py'
Nov 24 01:36:06 compute-0 sudo[35873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:07 compute-0 python3.9[35875]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 24 01:36:08 compute-0 sudo[35873]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:08 compute-0 sudo[36026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pogudaxypkcsflrmubpwplhfkwaualqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948168.274025-192-152617086484179/AnsiballZ_file.py'
Nov 24 01:36:08 compute-0 sudo[36026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:09 compute-0 python3.9[36028]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:36:09 compute-0 sudo[36026]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:10 compute-0 sudo[36178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzqhdtojvvzrpumxamtkdpqzfatkvtnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948169.4890213-200-144533796697960/AnsiballZ_mount.py'
Nov 24 01:36:10 compute-0 sudo[36178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:10 compute-0 python3.9[36180]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 24 01:36:10 compute-0 sudo[36178]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:11 compute-0 sudo[36330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itzhfgaijfhrznqzyuwwzlbgufckwpfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948171.0772645-228-15479438698838/AnsiballZ_file.py'
Nov 24 01:36:11 compute-0 sudo[36330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:11 compute-0 python3.9[36332]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:36:11 compute-0 sudo[36330]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:12 compute-0 sudo[36482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kskaiqqxgwhjptgipyhqhpsqxxwmyeed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948171.7278109-236-47355140843627/AnsiballZ_stat.py'
Nov 24 01:36:12 compute-0 sudo[36482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:12 compute-0 python3.9[36484]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:36:12 compute-0 sudo[36482]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:12 compute-0 sudo[36605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbnmctljtotbqvuayuwznccdnkzhtikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948171.7278109-236-47355140843627/AnsiballZ_copy.py'
Nov 24 01:36:12 compute-0 sudo[36605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:12 compute-0 python3.9[36607]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948171.7278109-236-47355140843627/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:36:12 compute-0 sudo[36605]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:13 compute-0 sudo[36757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fesfsaboyvupedbczdlekfaojkineaaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948173.435871-260-156181491423670/AnsiballZ_stat.py'
Nov 24 01:36:13 compute-0 sudo[36757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:16 compute-0 python3.9[36759]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:36:16 compute-0 sudo[36757]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:16 compute-0 sudo[36909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqgjrmewirhvhufpiqnrhxdidqpzftax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948176.3972125-268-139445507807434/AnsiballZ_command.py'
Nov 24 01:36:16 compute-0 sudo[36909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:16 compute-0 python3.9[36911]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:16 compute-0 sudo[36909]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:17 compute-0 sudo[37062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzwzzkbksppbhuljrgvcxkhgseejhavh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948177.1200597-276-187025577992034/AnsiballZ_file.py'
Nov 24 01:36:17 compute-0 sudo[37062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:17 compute-0 python3.9[37064]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:36:17 compute-0 sudo[37062]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:18 compute-0 sudo[37214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smpmmtqhwkhmgaypkhqbacrxmjdhxawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948178.0133457-287-61398867724598/AnsiballZ_getent.py'
Nov 24 01:36:18 compute-0 sudo[37214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:18 compute-0 python3.9[37216]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 24 01:36:18 compute-0 sudo[37214]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:18 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:36:19 compute-0 sudo[37368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lghsuiuysroptoietylkhcogpxjcwjab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948178.8712-295-105859423625894/AnsiballZ_group.py'
Nov 24 01:36:19 compute-0 sudo[37368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:19 compute-0 python3.9[37370]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:36:19 compute-0 groupadd[37371]: group added to /etc/group: name=qemu, GID=107
Nov 24 01:36:19 compute-0 groupadd[37371]: group added to /etc/gshadow: name=qemu
Nov 24 01:36:19 compute-0 groupadd[37371]: new group: name=qemu, GID=107
Nov 24 01:36:19 compute-0 sudo[37368]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:20 compute-0 sudo[37526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvnfuqlpymrbcexfmqniprowrugtrhns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948179.998421-303-177805139264592/AnsiballZ_user.py'
Nov 24 01:36:20 compute-0 sudo[37526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:20 compute-0 python3.9[37528]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 01:36:20 compute-0 useradd[37530]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 01:36:20 compute-0 sudo[37526]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:21 compute-0 sudo[37686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsyppyalvltjjympjomljrtdwjcezbrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948180.981021-311-208358956050223/AnsiballZ_getent.py'
Nov 24 01:36:21 compute-0 sudo[37686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:21 compute-0 python3.9[37688]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 24 01:36:21 compute-0 sudo[37686]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:21 compute-0 sudo[37839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuvazybnqrooaevhcpelpxnlcrtniceo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948181.7092643-319-62142914750825/AnsiballZ_group.py'
Nov 24 01:36:21 compute-0 sudo[37839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:22 compute-0 python3.9[37841]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:36:22 compute-0 groupadd[37842]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 24 01:36:22 compute-0 groupadd[37842]: group added to /etc/gshadow: name=hugetlbfs
Nov 24 01:36:22 compute-0 groupadd[37842]: new group: name=hugetlbfs, GID=42477
Nov 24 01:36:22 compute-0 sudo[37839]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:22 compute-0 sudo[37997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogiqrxnfkdadtwtojxrlbteokeservgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948182.4639661-328-235094477853362/AnsiballZ_file.py'
Nov 24 01:36:22 compute-0 sudo[37997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:22 compute-0 python3.9[37999]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 24 01:36:22 compute-0 sudo[37997]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:23 compute-0 sudo[38149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sckvzsrogbnseduszzzzkpphdbeeitxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948183.1886778-339-212545664107875/AnsiballZ_dnf.py'
Nov 24 01:36:23 compute-0 sudo[38149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:23 compute-0 python3.9[38151]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:36:25 compute-0 sudo[38149]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:25 compute-0 sudo[38302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgoxnulrkbxpoukaegxdmdzkymgobuyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948185.5882673-347-41561884003907/AnsiballZ_file.py'
Nov 24 01:36:25 compute-0 sudo[38302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:26 compute-0 python3.9[38304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:36:26 compute-0 sudo[38302]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:26 compute-0 sudo[38454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxssbqkaiglmqkcqlqndabnevjksykkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948186.2204418-355-44794938438258/AnsiballZ_stat.py'
Nov 24 01:36:26 compute-0 sudo[38454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:26 compute-0 python3.9[38456]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:36:26 compute-0 sudo[38454]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:27 compute-0 sudo[38577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pylyxyvydlorucolccqkdxfgdetknsxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948186.2204418-355-44794938438258/AnsiballZ_copy.py'
Nov 24 01:36:27 compute-0 sudo[38577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:27 compute-0 python3.9[38579]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948186.2204418-355-44794938438258/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:36:27 compute-0 sudo[38577]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:28 compute-0 sudo[38729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsctnjxlfwvkxbecoqkozxxgxxowwear ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948187.450378-370-63140328278920/AnsiballZ_systemd.py'
Nov 24 01:36:28 compute-0 sudo[38729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:28 compute-0 python3.9[38731]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:36:28 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 01:36:28 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 24 01:36:28 compute-0 kernel: Bridge firewalling registered
Nov 24 01:36:28 compute-0 systemd-modules-load[38735]: Inserted module 'br_netfilter'
Nov 24 01:36:28 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 01:36:28 compute-0 sudo[38729]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:28 compute-0 sudo[38890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocjfmqqgdsrovhzzahslebjjohrwusvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948188.681646-378-236670736065431/AnsiballZ_stat.py'
Nov 24 01:36:28 compute-0 sudo[38890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:29 compute-0 python3.9[38892]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:36:29 compute-0 sudo[38890]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:29 compute-0 sudo[39013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiofnqglxyxogzfxmhgdlhpzrdkvtbfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948188.681646-378-236670736065431/AnsiballZ_copy.py'
Nov 24 01:36:29 compute-0 sudo[39013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:29 compute-0 python3.9[39015]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948188.681646-378-236670736065431/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:36:29 compute-0 sudo[39013]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:30 compute-0 sudo[39165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwzhslfqgtbrmbglbtyfyjkrixvudeeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948189.9821835-396-97140713472928/AnsiballZ_dnf.py'
Nov 24 01:36:30 compute-0 sudo[39165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:30 compute-0 python3.9[39167]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:36:31 compute-0 sshd-session[39169]: Received disconnect from 46.188.119.26 port 58752:11: Bye Bye [preauth]
Nov 24 01:36:31 compute-0 sshd-session[39169]: Disconnected from authenticating user root 46.188.119.26 port 58752 [preauth]
Nov 24 01:36:33 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:36:33 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:36:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:36:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:36:34 compute-0 systemd[1]: Reloading.
Nov 24 01:36:34 compute-0 systemd-rc-local-generator[39232]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:36:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:36:34 compute-0 sudo[39165]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:35 compute-0 python3.9[40390]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:36:36 compute-0 python3.9[41256]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 24 01:36:36 compute-0 python3.9[41999]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:36:37 compute-0 sudo[42884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmwdhctvwznetajtuiqevevlahypterp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948197.1082346-435-199303856713313/AnsiballZ_command.py'
Nov 24 01:36:37 compute-0 sudo[42884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:37 compute-0 python3.9[42895]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:37 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 01:36:38 compute-0 systemd[1]: Starting Authorization Manager...
Nov 24 01:36:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:36:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:36:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.934s CPU time.
Nov 24 01:36:38 compute-0 systemd[1]: run-rd1ba733ac9834d06b2a7d22392f0ff6d.service: Deactivated successfully.
Nov 24 01:36:38 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 01:36:38 compute-0 polkitd[43547]: Started polkitd version 0.117
Nov 24 01:36:38 compute-0 polkitd[43547]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 01:36:38 compute-0 polkitd[43547]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 01:36:38 compute-0 polkitd[43547]: Finished loading, compiling and executing 2 rules
Nov 24 01:36:38 compute-0 systemd[1]: Started Authorization Manager.
Nov 24 01:36:38 compute-0 polkitd[43547]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 24 01:36:38 compute-0 sudo[42884]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:38 compute-0 sudo[43716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqcoubhjgafjgbbctalwmgkmrfzrenjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948198.5169318-444-11477367385268/AnsiballZ_systemd.py'
Nov 24 01:36:38 compute-0 sudo[43716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:39 compute-0 python3.9[43718]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:36:39 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 24 01:36:39 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 24 01:36:39 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 24 01:36:39 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 01:36:39 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 01:36:39 compute-0 sudo[43716]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:40 compute-0 python3.9[43879]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 24 01:36:40 compute-0 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 24 01:36:40 compute-0 irqbalance[784]: IRQ 26 affinity is now unmanaged
Nov 24 01:36:42 compute-0 sudo[44029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eikbblbxjjuikxhnfcjsebvgpdnygkyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948201.7403693-501-83081092924608/AnsiballZ_systemd.py'
Nov 24 01:36:42 compute-0 sudo[44029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:42 compute-0 python3.9[44031]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:36:42 compute-0 systemd[1]: Reloading.
Nov 24 01:36:42 compute-0 systemd-rc-local-generator[44057]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:36:42 compute-0 sudo[44029]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:43 compute-0 sudo[44218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcuphxboeqbzwfgvgomivhvttdvpzocl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948202.8169682-501-67916521897413/AnsiballZ_systemd.py'
Nov 24 01:36:43 compute-0 sudo[44218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:43 compute-0 python3.9[44220]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:36:43 compute-0 systemd[1]: Reloading.
Nov 24 01:36:43 compute-0 systemd-rc-local-generator[44249]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:36:43 compute-0 sudo[44218]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:44 compute-0 sudo[44408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhyouksxkfsavqcnqnrjyrydbdoxevxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948203.9493709-517-55422632126483/AnsiballZ_command.py'
Nov 24 01:36:44 compute-0 sudo[44408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:44 compute-0 python3.9[44410]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:44 compute-0 sudo[44408]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:44 compute-0 sudo[44561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bedqdsenercowgtdwbhrmcksyihixnhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948204.6294978-525-274756323873613/AnsiballZ_command.py'
Nov 24 01:36:44 compute-0 sudo[44561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:45 compute-0 python3.9[44563]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:45 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 24 01:36:45 compute-0 sudo[44561]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:45 compute-0 sudo[44714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nraquowdaxbdpdwvylxdtnayemehxwbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948205.3293204-533-67142760685171/AnsiballZ_command.py'
Nov 24 01:36:45 compute-0 sudo[44714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:45 compute-0 python3.9[44716]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:47 compute-0 sudo[44714]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:47 compute-0 sudo[44876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrtcuzuxzbpfttyfzyalplkgubhbzjgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948207.4720545-541-278936704924313/AnsiballZ_command.py'
Nov 24 01:36:47 compute-0 sudo[44876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:47 compute-0 python3.9[44878]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:47 compute-0 sudo[44876]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:48 compute-0 sudo[45029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpnsylwnmbijdvppkjnbaoxbxhraouxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948208.1304202-549-38783641710435/AnsiballZ_systemd.py'
Nov 24 01:36:48 compute-0 sudo[45029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:48 compute-0 python3.9[45031]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:36:48 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 01:36:48 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 24 01:36:48 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 24 01:36:48 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 24 01:36:48 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 01:36:48 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 24 01:36:48 compute-0 sudo[45029]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:49 compute-0 sshd-session[31427]: Connection closed by 192.168.122.30 port 47048
Nov 24 01:36:49 compute-0 sshd-session[31424]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:36:49 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 24 01:36:49 compute-0 systemd[1]: session-9.scope: Consumed 2min 13.918s CPU time.
Nov 24 01:36:49 compute-0 systemd-logind[791]: Session 9 logged out. Waiting for processes to exit.
Nov 24 01:36:49 compute-0 systemd-logind[791]: Removed session 9.
Nov 24 01:36:54 compute-0 sshd-session[45061]: Accepted publickey for zuul from 192.168.122.30 port 37484 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:36:54 compute-0 systemd-logind[791]: New session 10 of user zuul.
Nov 24 01:36:54 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 24 01:36:54 compute-0 sshd-session[45061]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:36:55 compute-0 python3.9[45214]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:36:56 compute-0 python3.9[45368]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:36:57 compute-0 sudo[45522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwsreeyxoovzezcojvrbhdcmbacsetk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948216.9441328-50-143952993555102/AnsiballZ_command.py'
Nov 24 01:36:57 compute-0 sudo[45522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:57 compute-0 python3.9[45524]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:36:57 compute-0 sudo[45522]: pam_unix(sudo:session): session closed for user root
Nov 24 01:36:58 compute-0 python3.9[45675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:36:59 compute-0 sudo[45829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkbuhldmdhgrpxvzzmyiesaajrewiwab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948218.9383812-70-149062605922736/AnsiballZ_setup.py'
Nov 24 01:36:59 compute-0 sudo[45829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:36:59 compute-0 python3.9[45831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:36:59 compute-0 sudo[45829]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:00 compute-0 sudo[45913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rupubkzhvtbvjrvwetqnmxrpyklmqcsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948218.9383812-70-149062605922736/AnsiballZ_dnf.py'
Nov 24 01:37:00 compute-0 sudo[45913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:00 compute-0 python3.9[45915]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:37:01 compute-0 sudo[45913]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:02 compute-0 sudo[46066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbtgjjrmkiwbarzpjozcjsmihufahfjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948221.8642507-82-246767581686044/AnsiballZ_setup.py'
Nov 24 01:37:02 compute-0 sudo[46066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:02 compute-0 python3.9[46068]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:37:02 compute-0 sudo[46066]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:03 compute-0 sudo[46237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufflwanvvogfuddqdxitsxtdxuaswoqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948222.8249516-93-170124626166718/AnsiballZ_file.py'
Nov 24 01:37:03 compute-0 sudo[46237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:03 compute-0 python3.9[46239]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:37:03 compute-0 sudo[46237]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:03 compute-0 sudo[46389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbshgfpmfaffswbcdfqyfjaccxhjomdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948223.73224-101-207368754505450/AnsiballZ_command.py'
Nov 24 01:37:03 compute-0 sudo[46389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:04 compute-0 python3.9[46391]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:37:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3874991740-merged.mount: Deactivated successfully.
Nov 24 01:37:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1324271133-merged.mount: Deactivated successfully.
Nov 24 01:37:04 compute-0 podman[46392]: 2025-11-24 01:37:04.2712999 +0000 UTC m=+0.073610703 system refresh
Nov 24 01:37:04 compute-0 sudo[46389]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:05 compute-0 sudo[46551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mullkpwwrlwwybfrkkhvcevbmuobxbhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948224.5131857-109-267409926536907/AnsiballZ_stat.py'
Nov 24 01:37:05 compute-0 sudo[46551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:05 compute-0 python3.9[46553]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:37:05 compute-0 sudo[46551]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:37:05 compute-0 sudo[46674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqdjucyncdoehadwzjlcbmjwogmktpqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948224.5131857-109-267409926536907/AnsiballZ_copy.py'
Nov 24 01:37:05 compute-0 sudo[46674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:05 compute-0 python3.9[46676]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948224.5131857-109-267409926536907/.source.json follow=False _original_basename=podman_network_config.j2 checksum=05c4d6456785acf69d26ef15f179712e624a1334 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:37:05 compute-0 sudo[46674]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:06 compute-0 sudo[46826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idveundhjjzprayogdfddiecwceujkhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948226.1203384-124-39511601677506/AnsiballZ_stat.py'
Nov 24 01:37:06 compute-0 sudo[46826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:06 compute-0 python3.9[46828]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:37:06 compute-0 sudo[46826]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:06 compute-0 sudo[46949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfrnlicxoyxbirxhhvmrxrmoyytnnwzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948226.1203384-124-39511601677506/AnsiballZ_copy.py'
Nov 24 01:37:06 compute-0 sudo[46949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:07 compute-0 python3.9[46951]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948226.1203384-124-39511601677506/.source.conf follow=False _original_basename=registries.conf.j2 checksum=888b975826b2c6c0439200ce8ac9219b96c0abdf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:37:07 compute-0 sudo[46949]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:07 compute-0 sudo[47101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuejrvpypdqhctikoxujwipmzgiykwvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948227.437041-140-227878418257834/AnsiballZ_ini_file.py'
Nov 24 01:37:07 compute-0 sudo[47101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:08 compute-0 python3.9[47103]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:37:08 compute-0 sudo[47101]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:08 compute-0 sudo[47253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnwhxqszpwsybsipafhbepqowtxedqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948228.2660444-140-223880575254243/AnsiballZ_ini_file.py'
Nov 24 01:37:08 compute-0 sudo[47253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:08 compute-0 python3.9[47255]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:37:08 compute-0 sudo[47253]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:09 compute-0 sudo[47405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zslcujyueopvgtiptoqomratosmqiskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948228.901162-140-52895632636467/AnsiballZ_ini_file.py'
Nov 24 01:37:09 compute-0 sudo[47405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:09 compute-0 python3.9[47407]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:37:09 compute-0 sudo[47405]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:09 compute-0 sudo[47557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhgykjvymqfaoabosrcayiiesznoliat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948229.576769-140-125175761755777/AnsiballZ_ini_file.py'
Nov 24 01:37:09 compute-0 sudo[47557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:10 compute-0 python3.9[47559]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:37:10 compute-0 sudo[47557]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:11 compute-0 python3.9[47709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:37:11 compute-0 sudo[47861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrxjnrqsbyncdklzfshpfwiyscnhopnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948231.2352176-180-222042733084632/AnsiballZ_dnf.py'
Nov 24 01:37:11 compute-0 sudo[47861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:11 compute-0 python3.9[47863]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:13 compute-0 sudo[47861]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:13 compute-0 sudo[48014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtoaydpntgyuqhnqtqtutuvlvllbvxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948233.2756855-188-159378924530731/AnsiballZ_dnf.py'
Nov 24 01:37:13 compute-0 sudo[48014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:13 compute-0 python3.9[48016]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:15 compute-0 sudo[48014]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:15 compute-0 sudo[48174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jugecbzjxmyftdpswtnlpmcjxchgjqdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948235.6543093-198-238949567285182/AnsiballZ_dnf.py'
Nov 24 01:37:15 compute-0 sudo[48174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:16 compute-0 python3.9[48176]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:17 compute-0 sudo[48174]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:17 compute-0 sudo[48327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twldawlltgdpoucqvcaqouosgprwiywq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948237.6452816-207-201152599652256/AnsiballZ_dnf.py'
Nov 24 01:37:17 compute-0 sudo[48327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:18 compute-0 python3.9[48329]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:19 compute-0 sudo[48327]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:20 compute-0 sudo[48480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sowmpxdxlxfjuabzkmmeiozlskwgmure ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948239.903038-218-237776183326573/AnsiballZ_dnf.py'
Nov 24 01:37:20 compute-0 sudo[48480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:20 compute-0 python3.9[48482]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:21 compute-0 sudo[48480]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:22 compute-0 sudo[48636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igreypvfywtcrzhioqzoielxrwutjwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948242.0381062-226-279918467154056/AnsiballZ_dnf.py'
Nov 24 01:37:22 compute-0 sudo[48636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:22 compute-0 python3.9[48638]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:22 compute-0 sshd-session[48640]: Connection closed by 159.65.46.209 port 56922
Nov 24 01:37:25 compute-0 sudo[48636]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:25 compute-0 sudo[48807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaedemwwzfeunnedesrvogytvvvnffhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948245.3401344-235-159184875049987/AnsiballZ_dnf.py'
Nov 24 01:37:25 compute-0 sudo[48807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:25 compute-0 python3.9[48809]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:27 compute-0 sudo[48807]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:27 compute-0 sudo[48960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owwxawxgvfzfqubsfgveenkojxgwrvvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948247.3458676-244-127009495294860/AnsiballZ_dnf.py'
Nov 24 01:37:27 compute-0 sudo[48960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:27 compute-0 python3.9[48962]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:41 compute-0 sudo[48960]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:42 compute-0 sudo[49296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szjupnoafoifvycjlqivnmhcphizvhuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948261.749429-253-222088547455378/AnsiballZ_dnf.py'
Nov 24 01:37:42 compute-0 sudo[49296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:42 compute-0 python3.9[49298]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:37:43 compute-0 sudo[49296]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:44 compute-0 sudo[49452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxezauhtamxdhzscgirfbawqemrnhbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948263.9195328-264-140260764248126/AnsiballZ_file.py'
Nov 24 01:37:44 compute-0 sudo[49452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:44 compute-0 python3.9[49454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:37:44 compute-0 sudo[49452]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:45 compute-0 sudo[49627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsvgehlwpxqbiijcscboxuzdftehdtpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948264.5327222-272-117309590373400/AnsiballZ_stat.py'
Nov 24 01:37:45 compute-0 sudo[49627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:45 compute-0 python3.9[49629]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:37:45 compute-0 sudo[49627]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:45 compute-0 sudo[49750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyuodbypxkktjqymquigpfevlkhuidq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948264.5327222-272-117309590373400/AnsiballZ_copy.py'
Nov 24 01:37:45 compute-0 sudo[49750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:45 compute-0 python3.9[49752]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763948264.5327222-272-117309590373400/.source.json _original_basename=.ir6mg8_r follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:37:45 compute-0 sudo[49750]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:46 compute-0 sudo[49902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bndwuxajjyedfddwlakvsfjvskebtjoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948266.022128-290-43285504888231/AnsiballZ_podman_image.py'
Nov 24 01:37:46 compute-0 sudo[49902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:46 compute-0 python3.9[49904]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:37:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:37:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1204643060-lower\x2dmapped.mount: Deactivated successfully.
Nov 24 01:37:49 compute-0 sshd-session[49967]: Received disconnect from 46.188.119.26 port 59082:11: Bye Bye [preauth]
Nov 24 01:37:49 compute-0 sshd-session[49967]: Disconnected from authenticating user root 46.188.119.26 port 59082 [preauth]
Nov 24 01:37:51 compute-0 podman[49916]: 2025-11-24 01:37:51.843662192 +0000 UTC m=+5.029298207 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 01:37:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:37:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:37:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:37:52 compute-0 sudo[49902]: pam_unix(sudo:session): session closed for user root
Nov 24 01:37:52 compute-0 sudo[50215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txfvejlcfqeximyltaeasljiahiqiwrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948272.4013534-301-56740530003379/AnsiballZ_podman_image.py'
Nov 24 01:37:52 compute-0 sudo[50215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:37:52 compute-0 python3.9[50217]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:37:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:01 compute-0 podman[50229]: 2025-11-24 01:38:01.909586486 +0000 UTC m=+8.923339287 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:38:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:02 compute-0 sudo[50215]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:02 compute-0 sudo[50524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndgeimbxedzwisyyltcyztjzqeencvlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948282.4041178-311-80654012822912/AnsiballZ_podman_image.py'
Nov 24 01:38:02 compute-0 sudo[50524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:02 compute-0 python3.9[50526]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:38:04 compute-0 podman[50538]: 2025-11-24 01:38:04.096865819 +0000 UTC m=+1.168523045 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 01:38:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:04 compute-0 sudo[50524]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:04 compute-0 sudo[50772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oobcfwqmgsbqhiyypmeyrvihbxhqpdzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948284.556979-320-127079714876642/AnsiballZ_podman_image.py'
Nov 24 01:38:04 compute-0 sudo[50772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:05 compute-0 python3.9[50774]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:38:13 compute-0 podman[50786]: 2025-11-24 01:38:13.746350973 +0000 UTC m=+8.664083412 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 01:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:13 compute-0 sudo[50772]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:14 compute-0 sudo[51040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whtnsrsgkbenxohiwfpoelyqitezgcvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948294.3726676-331-203148705669496/AnsiballZ_podman_image.py'
Nov 24 01:38:14 compute-0 sudo[51040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:14 compute-0 python3.9[51042]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:38:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:18 compute-0 podman[51052]: 2025-11-24 01:38:18.471207599 +0000 UTC m=+3.566230312 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 24 01:38:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:18 compute-0 sudo[51040]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:19 compute-0 sudo[51304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fglaeqxdumaqihhjsjklutweieskooaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948298.8423684-331-45228963912696/AnsiballZ_podman_image.py'
Nov 24 01:38:19 compute-0 sudo[51304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:19 compute-0 python3.9[51306]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 01:38:21 compute-0 podman[51318]: 2025-11-24 01:38:21.457277829 +0000 UTC m=+2.087569212 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 24 01:38:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:38:21 compute-0 sudo[51304]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:22 compute-0 sshd-session[45064]: Connection closed by 192.168.122.30 port 37484
Nov 24 01:38:22 compute-0 sshd-session[45061]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:38:22 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 24 01:38:22 compute-0 systemd[1]: session-10.scope: Consumed 1min 47.387s CPU time.
Nov 24 01:38:22 compute-0 systemd-logind[791]: Session 10 logged out. Waiting for processes to exit.
Nov 24 01:38:22 compute-0 systemd-logind[791]: Removed session 10.
Nov 24 01:38:27 compute-0 sshd-session[51463]: Accepted publickey for zuul from 192.168.122.30 port 35276 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:38:27 compute-0 systemd-logind[791]: New session 11 of user zuul.
Nov 24 01:38:27 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 24 01:38:27 compute-0 sshd-session[51463]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:38:28 compute-0 python3.9[51616]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:38:29 compute-0 sudo[51770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpswwedubpymvmaddpigfpcejrnvrspk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948308.869821-36-142001887028985/AnsiballZ_getent.py'
Nov 24 01:38:29 compute-0 sudo[51770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:29 compute-0 python3.9[51772]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 24 01:38:29 compute-0 sudo[51770]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:30 compute-0 sudo[51925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyyuhztxziajhbpmhrwzhwhcxiilmobl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948309.652214-44-180787446983446/AnsiballZ_group.py'
Nov 24 01:38:30 compute-0 sudo[51925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:30 compute-0 python3.9[51927]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:38:30 compute-0 groupadd[51928]: group added to /etc/group: name=openvswitch, GID=42476
Nov 24 01:38:30 compute-0 groupadd[51928]: group added to /etc/gshadow: name=openvswitch
Nov 24 01:38:30 compute-0 groupadd[51928]: new group: name=openvswitch, GID=42476
Nov 24 01:38:30 compute-0 sudo[51925]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:30 compute-0 sshd-session[51850]: Received disconnect from 193.46.255.7 port 28526:11:  [preauth]
Nov 24 01:38:30 compute-0 sshd-session[51850]: Disconnected from authenticating user root 193.46.255.7 port 28526 [preauth]
Nov 24 01:38:31 compute-0 sudo[52083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oujfyfyepnokrnzyhmehyhldoijowygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948310.5501282-52-41852363208868/AnsiballZ_user.py'
Nov 24 01:38:31 compute-0 sudo[52083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:31 compute-0 python3.9[52085]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 01:38:31 compute-0 useradd[52087]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 01:38:31 compute-0 useradd[52087]: add 'openvswitch' to group 'hugetlbfs'
Nov 24 01:38:31 compute-0 useradd[52087]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 24 01:38:31 compute-0 sudo[52083]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:31 compute-0 sudo[52243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymgqprwtsnqknvnrpvcpldguzrmtorzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948311.605577-62-155202078158450/AnsiballZ_setup.py'
Nov 24 01:38:31 compute-0 sudo[52243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:32 compute-0 python3.9[52245]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:38:32 compute-0 sudo[52243]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:32 compute-0 sudo[52327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auxvjaadxfizqwbwbnpgvtexupqijsyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948311.605577-62-155202078158450/AnsiballZ_dnf.py'
Nov 24 01:38:32 compute-0 sudo[52327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:33 compute-0 python3.9[52329]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:38:34 compute-0 sudo[52327]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:35 compute-0 sudo[52488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpucevxtezjswcedlgkohmyteboqnabe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948315.0350738-76-10508979396224/AnsiballZ_dnf.py'
Nov 24 01:38:35 compute-0 sudo[52488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:35 compute-0 python3.9[52490]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:38:48 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:38:48 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:38:48 compute-0 groupadd[52514]: group added to /etc/group: name=unbound, GID=993
Nov 24 01:38:48 compute-0 groupadd[52514]: group added to /etc/gshadow: name=unbound
Nov 24 01:38:48 compute-0 groupadd[52514]: new group: name=unbound, GID=993
Nov 24 01:38:48 compute-0 useradd[52521]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 24 01:38:48 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 24 01:38:48 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 24 01:38:49 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:38:49 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:38:49 compute-0 systemd[1]: Reloading.
Nov 24 01:38:49 compute-0 systemd-sysv-generator[53024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:38:49 compute-0 systemd-rc-local-generator[53020]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:38:50 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:38:50 compute-0 sudo[52488]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:50 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:38:50 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:38:50 compute-0 systemd[1]: run-r4ec5fc23e09e4dec9b519504b6177a0f.service: Deactivated successfully.
Nov 24 01:38:51 compute-0 sudo[53587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuputumjrzpfbdboipzxxzyjwwnmsujw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948330.798767-84-86492551656746/AnsiballZ_systemd.py'
Nov 24 01:38:51 compute-0 sudo[53587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:51 compute-0 python3.9[53589]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:38:51 compute-0 systemd[1]: Reloading.
Nov 24 01:38:51 compute-0 systemd-sysv-generator[53622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:38:51 compute-0 systemd-rc-local-generator[53615]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:38:51 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 24 01:38:51 compute-0 chown[53631]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 24 01:38:52 compute-0 ovs-ctl[53636]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 24 01:38:52 compute-0 ovs-ctl[53636]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 24 01:38:52 compute-0 ovs-ctl[53636]: Starting ovsdb-server [  OK  ]
Nov 24 01:38:52 compute-0 ovs-vsctl[53685]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 24 01:38:52 compute-0 ovs-vsctl[53705]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"e8ad7b7b-7799-4041-b082-e8facd56e34a\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 24 01:38:52 compute-0 ovs-ctl[53636]: Configuring Open vSwitch system IDs [  OK  ]
Nov 24 01:38:52 compute-0 ovs-vsctl[53711]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 24 01:38:52 compute-0 ovs-ctl[53636]: Enabling remote OVSDB managers [  OK  ]
Nov 24 01:38:52 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 24 01:38:52 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 24 01:38:52 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 24 01:38:52 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 24 01:38:52 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 24 01:38:52 compute-0 ovs-ctl[53756]: Inserting openvswitch module [  OK  ]
Nov 24 01:38:52 compute-0 ovs-ctl[53725]: Starting ovs-vswitchd [  OK  ]
Nov 24 01:38:52 compute-0 ovs-vsctl[53773]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 24 01:38:52 compute-0 ovs-ctl[53725]: Enabling remote OVSDB managers [  OK  ]
Nov 24 01:38:52 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 24 01:38:52 compute-0 systemd[1]: Starting Open vSwitch...
Nov 24 01:38:52 compute-0 systemd[1]: Finished Open vSwitch.
Nov 24 01:38:52 compute-0 sudo[53587]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:53 compute-0 python3.9[53925]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:38:54 compute-0 sudo[54075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhgjbfzqwpbjbpgemgkocwktkmbddkor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948333.6252363-102-176871329922123/AnsiballZ_sefcontext.py'
Nov 24 01:38:54 compute-0 sudo[54075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:54 compute-0 python3.9[54077]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 24 01:38:55 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:38:55 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:38:55 compute-0 sudo[54075]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:56 compute-0 python3.9[54232]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:38:57 compute-0 sudo[54388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfnsdgpcbibxbbkeqmaddzzkprezqpap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948337.0157273-120-179934336067543/AnsiballZ_dnf.py'
Nov 24 01:38:57 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 24 01:38:57 compute-0 sudo[54388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:57 compute-0 python3.9[54390]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:38:58 compute-0 sudo[54388]: pam_unix(sudo:session): session closed for user root
Nov 24 01:38:59 compute-0 sudo[54541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksigxrabjghdbrkcuntdvcxlvxcuoqkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948338.9837766-128-204169855909224/AnsiballZ_command.py'
Nov 24 01:38:59 compute-0 sudo[54541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:38:59 compute-0 python3.9[54543]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:39:00 compute-0 sudo[54541]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:00 compute-0 sudo[54828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngkmmfbdksxxcggicnpwgmbbydcpwebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948340.5434165-136-218003957613098/AnsiballZ_file.py'
Nov 24 01:39:00 compute-0 sudo[54828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:01 compute-0 python3.9[54830]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 01:39:01 compute-0 sudo[54828]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:01 compute-0 python3.9[54980]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:39:02 compute-0 sudo[55132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxkfmffoutwlmasewxfxujxgmmmzegbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948342.0561283-152-252112090955695/AnsiballZ_dnf.py'
Nov 24 01:39:02 compute-0 sudo[55132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:02 compute-0 python3.9[55134]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:39:04 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:39:04 compute-0 sshd-session[55136]: Invalid user aa from 46.188.119.26 port 59410
Nov 24 01:39:04 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:39:04 compute-0 systemd[1]: Reloading.
Nov 24 01:39:04 compute-0 systemd-sysv-generator[55180]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:39:04 compute-0 systemd-rc-local-generator[55177]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:39:04 compute-0 sshd-session[55136]: Received disconnect from 46.188.119.26 port 59410:11: Bye Bye [preauth]
Nov 24 01:39:04 compute-0 sshd-session[55136]: Disconnected from invalid user aa 46.188.119.26 port 59410 [preauth]
Nov 24 01:39:04 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:39:04 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:39:04 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:39:04 compute-0 systemd[1]: run-r91697c52fd56411b912fc79ccdeaee90.service: Deactivated successfully.
Nov 24 01:39:05 compute-0 sudo[55132]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:05 compute-0 sudo[55451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gonjzeifersdcsaqxopbtyghqguzgnis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948345.2385497-160-30327840362305/AnsiballZ_systemd.py'
Nov 24 01:39:05 compute-0 sudo[55451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:05 compute-0 python3.9[55453]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:39:05 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 01:39:05 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 24 01:39:05 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 24 01:39:05 compute-0 systemd[1]: Stopping Network Manager...
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8637] caught SIGTERM, shutting down normally.
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8655] dhcp4 (eth0): canceled DHCP transaction
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8656] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8656] dhcp4 (eth0): state changed no lease
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8658] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 01:39:05 compute-0 NetworkManager[7185]: <info>  [1763948345.8715] exiting (success)
Nov 24 01:39:05 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:39:05 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 01:39:05 compute-0 systemd[1]: Stopped Network Manager.
Nov 24 01:39:05 compute-0 systemd[1]: NetworkManager.service: Consumed 11.904s CPU time, 4.1M memory peak, read 0B from disk, written 11.0K to disk.
Nov 24 01:39:05 compute-0 systemd[1]: Starting Network Manager...
Nov 24 01:39:05 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:39:05 compute-0 NetworkManager[55458]: <info>  [1763948345.9222] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c54d865c-1bf1-4cad-ad82-0976a3ee1591)
Nov 24 01:39:05 compute-0 NetworkManager[55458]: <info>  [1763948345.9225] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 01:39:05 compute-0 NetworkManager[55458]: <info>  [1763948345.9276] manager[0x55b80fa08090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 01:39:05 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 01:39:06 compute-0 systemd[1]: Started Hostname Service.
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0039] hostname: hostname: using hostnamed
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0040] hostname: static hostname changed from (none) to "compute-0"
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0043] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0047] manager[0x55b80fa08090]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0047] manager[0x55b80fa08090]: rfkill: WWAN hardware radio set enabled
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0064] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0072] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0073] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0073] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0074] manager: Networking is enabled by state file
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0075] settings: Loaded settings plugin: keyfile (internal)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0078] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0097] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0106] dhcp: init: Using DHCP client 'internal'
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0109] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0113] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0120] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0125] device (lo): Activation: starting connection 'lo' (da7d2480-bf68-42ff-860c-6b8f4466c871)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0131] device (eth0): carrier: link connected
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0134] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0137] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0138] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0143] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0148] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0153] device (eth1): carrier: link connected
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0156] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0160] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (e2301243-2a8f-5270-a46e-0de358c9532a) (indicated)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0160] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0164] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0169] device (eth1): Activation: starting connection 'ci-private-network' (e2301243-2a8f-5270-a46e-0de358c9532a)
Nov 24 01:39:06 compute-0 systemd[1]: Started Network Manager.
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0174] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0181] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0182] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0184] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0185] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0188] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0190] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0192] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0194] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0200] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0202] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0209] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0218] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0224] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0226] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0231] device (lo): Activation: successful, device activated.
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0236] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0238] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0241] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0244] device (eth1): Activation: successful, device activated.
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0252] dhcp4 (eth0): state changed new lease, address=38.102.83.32
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0257] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 01:39:06 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0334] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0374] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0377] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0383] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0387] device (eth0): Activation: successful, device activated.
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0395] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 01:39:06 compute-0 NetworkManager[55458]: <info>  [1763948346.0419] manager: startup complete
Nov 24 01:39:06 compute-0 sudo[55451]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:06 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 24 01:39:06 compute-0 sudo[55677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixufinzhdkjiosvttuohcwqnzvdngbvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948346.2407575-168-73156481891843/AnsiballZ_dnf.py'
Nov 24 01:39:06 compute-0 sudo[55677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:06 compute-0 python3.9[55679]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:39:11 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:39:11 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:39:11 compute-0 systemd[1]: Reloading.
Nov 24 01:39:11 compute-0 systemd-sysv-generator[55737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:39:11 compute-0 systemd-rc-local-generator[55733]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:39:11 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:39:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:39:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:39:11 compute-0 systemd[1]: run-rc623b787f0434ab381454199a6d17998.service: Deactivated successfully.
Nov 24 01:39:12 compute-0 sudo[55677]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:12 compute-0 sudo[56137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmhuafyjtgprxxexnhkwgclbpgkmhbdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948352.5040534-180-30744736699976/AnsiballZ_stat.py'
Nov 24 01:39:12 compute-0 sudo[56137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:12 compute-0 python3.9[56139]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:39:12 compute-0 sudo[56137]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:13 compute-0 sudo[56289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpqamfcnkatkigztnbtvwycamdqarigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948353.16953-189-230788431844152/AnsiballZ_ini_file.py'
Nov 24 01:39:13 compute-0 sudo[56289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:13 compute-0 python3.9[56291]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:13 compute-0 sudo[56289]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:14 compute-0 sudo[56443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbgpqifusdzwrccglzuyagvouobtrqgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948353.9509897-199-125121373027409/AnsiballZ_ini_file.py'
Nov 24 01:39:14 compute-0 sudo[56443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:14 compute-0 python3.9[56445]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:14 compute-0 sudo[56443]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:14 compute-0 sudo[56595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nldfqndibczworhwxkznizmuyllkpdyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948354.512751-199-247075458278834/AnsiballZ_ini_file.py'
Nov 24 01:39:14 compute-0 sudo[56595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:14 compute-0 python3.9[56597]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:14 compute-0 sudo[56595]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:15 compute-0 sudo[56747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfesdfgmdzldlxqmeudocfzfiabglhau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948355.1637318-214-255775353189899/AnsiballZ_ini_file.py'
Nov 24 01:39:15 compute-0 sudo[56747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:15 compute-0 python3.9[56749]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:15 compute-0 sudo[56747]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:16 compute-0 sudo[56899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyylwxschfphbwlbhnlrsqerbdolvoie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948355.7835495-214-202685656936062/AnsiballZ_ini_file.py'
Nov 24 01:39:16 compute-0 sudo[56899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:16 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:39:16 compute-0 python3.9[56901]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:16 compute-0 sudo[56899]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:16 compute-0 sudo[57051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaxqetzwskehycggouagpquaxqvcxifv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948356.402185-229-246360071345970/AnsiballZ_stat.py'
Nov 24 01:39:16 compute-0 sudo[57051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:16 compute-0 python3.9[57053]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:39:16 compute-0 sudo[57051]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:17 compute-0 sudo[57174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnnjifghztlcjrdllkolbvdicvwmlnnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948356.402185-229-246360071345970/AnsiballZ_copy.py'
Nov 24 01:39:17 compute-0 sudo[57174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:17 compute-0 python3.9[57176]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948356.402185-229-246360071345970/.source _original_basename=.ais1leub follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:17 compute-0 sudo[57174]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:18 compute-0 sudo[57326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrxpgdhgajyondtcmjzolasfyqxxlkto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948357.8170352-244-44027154438860/AnsiballZ_file.py'
Nov 24 01:39:18 compute-0 sudo[57326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:18 compute-0 python3.9[57328]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:18 compute-0 sudo[57326]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:18 compute-0 sudo[57478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luppsikwcmytjbuziutbrwyyiwabandm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948358.4598448-252-271815654605276/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 24 01:39:18 compute-0 sudo[57478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:19 compute-0 python3.9[57480]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 24 01:39:19 compute-0 sudo[57478]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:19 compute-0 sudo[57630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evmcdwnfwxrhbouultfeeqmxqymgvdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948359.233315-261-70474094567088/AnsiballZ_file.py'
Nov 24 01:39:19 compute-0 sudo[57630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:19 compute-0 python3.9[57632]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:19 compute-0 sudo[57630]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:20 compute-0 sudo[57782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstmuzoyfbgkvdmwgruvhktnfyqmojad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948360.2302938-271-43246554714844/AnsiballZ_stat.py'
Nov 24 01:39:20 compute-0 sudo[57782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:20 compute-0 sudo[57782]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:21 compute-0 sudo[57905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eytcsbypdzvunzczdbumokolcugioxqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948360.2302938-271-43246554714844/AnsiballZ_copy.py'
Nov 24 01:39:21 compute-0 sudo[57905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:21 compute-0 sudo[57905]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:21 compute-0 sudo[58057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klmspjukfoangldbdvzcklhnhwjpubqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948361.5136194-286-188615417144567/AnsiballZ_slurp.py'
Nov 24 01:39:21 compute-0 sudo[58057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:22 compute-0 python3.9[58059]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 24 01:39:22 compute-0 sudo[58057]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:23 compute-0 sudo[58232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfdxfwspzykznqqpcyswqjaoqssuyvgr ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948362.4340103-295-157654592745294/async_wrapper.py j628661793950 300 /home/zuul/.ansible/tmp/ansible-tmp-1763948362.4340103-295-157654592745294/AnsiballZ_edpm_os_net_config.py _'
Nov 24 01:39:23 compute-0 sudo[58232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:23 compute-0 ansible-async_wrapper.py[58234]: Invoked with j628661793950 300 /home/zuul/.ansible/tmp/ansible-tmp-1763948362.4340103-295-157654592745294/AnsiballZ_edpm_os_net_config.py _
Nov 24 01:39:23 compute-0 ansible-async_wrapper.py[58237]: Starting module and watcher
Nov 24 01:39:23 compute-0 ansible-async_wrapper.py[58237]: Start watching 58238 (300)
Nov 24 01:39:23 compute-0 ansible-async_wrapper.py[58238]: Start module (58238)
Nov 24 01:39:23 compute-0 ansible-async_wrapper.py[58234]: Return async_wrapper task started.
Nov 24 01:39:23 compute-0 sudo[58232]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:23 compute-0 python3.9[58239]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 24 01:39:24 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 24 01:39:24 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 24 01:39:24 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 24 01:39:24 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 24 01:39:24 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3282] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3297] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3894] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3896] audit: op="connection-add" uuid="8d0d3b30-2061-4066-b190-93b9bddd44f4" name="br-ex-br" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3923] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3924] audit: op="connection-add" uuid="b107bec4-2987-4e6d-826a-072fb7bf8a53" name="br-ex-port" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3943] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3945] audit: op="connection-add" uuid="48df6a12-3267-4946-8439-5fb41c877051" name="eth1-port" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3963] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3965] audit: op="connection-add" uuid="cfe20344-9a93-42f0-a828-062d4083e9e9" name="vlan20-port" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3986] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.3988] audit: op="connection-add" uuid="f74dd4cb-2e92-4c67-9d79-927a2128fd51" name="vlan21-port" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4003] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4006] audit: op="connection-add" uuid="e8510b34-34a9-4649-86e2-371704b91501" name="vlan22-port" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4030] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,connection.autoconnect-priority,connection.timestamp" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4051] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4053] audit: op="connection-add" uuid="5f9c8c88-678a-4a50-bcef-754c2c9df757" name="br-ex-if" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4110] audit: op="connection-update" uuid="e2301243-2a8f-5270-a46e-0de358c9532a" name="ci-private-network" args="ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.method,ipv4.dns,ipv4.routing-rules,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routes,ipv6.method,ipv6.dns,ipv6.routing-rules,ovs-interface.type,ovs-external-ids.data,connection.master,connection.port-type,connection.timestamp,connection.slave-type,connection.controller" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4132] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4135] audit: op="connection-add" uuid="7e2e0e19-99de-4a8f-8125-30378deea6b8" name="vlan20-if" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4159] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4162] audit: op="connection-add" uuid="ef4157af-cc6a-47a6-bf75-8b169ad34662" name="vlan21-if" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4183] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4186] audit: op="connection-add" uuid="f2a3a09f-462c-4020-81c2-3e51e463b36f" name="vlan22-if" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4200] audit: op="connection-delete" uuid="798898cd-a7a8-3202-830a-832499ef243d" name="Wired connection 1" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4215] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4228] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4232] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (8d0d3b30-2061-4066-b190-93b9bddd44f4)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4233] audit: op="connection-activate" uuid="8d0d3b30-2061-4066-b190-93b9bddd44f4" name="br-ex-br" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4235] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4242] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4247] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b107bec4-2987-4e6d-826a-072fb7bf8a53)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4249] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4255] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4260] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (48df6a12-3267-4946-8439-5fb41c877051)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4263] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4270] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4276] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cfe20344-9a93-42f0-a828-062d4083e9e9)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4278] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4286] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4292] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f74dd4cb-2e92-4c67-9d79-927a2128fd51)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4294] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4302] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4309] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (e8510b34-34a9-4649-86e2-371704b91501)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4310] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4313] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4315] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4323] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4330] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4334] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (5f9c8c88-678a-4a50-bcef-754c2c9df757)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4335] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4340] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4343] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4345] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4347] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4361] device (eth1): disconnecting for new activation request.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4363] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4366] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4368] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4370] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4373] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4379] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4385] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7e2e0e19-99de-4a8f-8125-30378deea6b8)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4386] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4390] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4392] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4394] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4397] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4404] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4410] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ef4157af-cc6a-47a6-bf75-8b169ad34662)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4410] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4415] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4417] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4419] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4423] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4428] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4433] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (f2a3a09f-462c-4020-81c2-3e51e463b36f)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4434] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4438] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4440] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4442] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4444] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4461] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4463] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4467] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4469] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4477] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4483] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4489] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4494] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4496] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4502] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4510] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4516] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4519] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 systemd-udevd[58244]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:39:25 compute-0 kernel: Timeout policy base is empty
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4526] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4533] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4538] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4541] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4548] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4553] dhcp4 (eth0): canceled DHCP transaction
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4553] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4554] dhcp4 (eth0): state changed no lease
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4555] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4564] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4571] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58240 uid=0 result="fail" reason="Device is not activated"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4606] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4611] dhcp4 (eth0): state changed new lease, address=38.102.83.32
Nov 24 01:39:25 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4651] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4657] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4663] device (eth1): disconnecting for new activation request.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4664] audit: op="connection-activate" uuid="e2301243-2a8f-5270-a46e-0de358c9532a" name="ci-private-network" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4664] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4765] device (eth1): Activation: starting connection 'ci-private-network' (e2301243-2a8f-5270-a46e-0de358c9532a)
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4770] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4786] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4790] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4796] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4800] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4805] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4806] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58240 uid=0 result="success"
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4807] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4809] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4810] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4812] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 kernel: br-ex: entered promiscuous mode
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4814] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4820] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4824] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4826] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4829] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4832] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4834] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4837] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4840] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4845] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4850] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4857] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4861] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4908] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4911] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.4918] device (eth1): Activation: successful, device activated.
Nov 24 01:39:25 compute-0 kernel: vlan22: entered promiscuous mode
Nov 24 01:39:25 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5002] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 24 01:39:25 compute-0 systemd-udevd[58246]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5012] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5034] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5036] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5042] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 kernel: vlan21: entered promiscuous mode
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5131] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 24 01:39:25 compute-0 kernel: vlan20: entered promiscuous mode
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5167] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5172] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5206] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5213] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5215] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5223] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5234] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5236] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5243] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5258] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5273] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5312] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5314] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 01:39:25 compute-0 NetworkManager[55458]: <info>  [1763948365.5323] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 01:39:26 compute-0 NetworkManager[55458]: <info>  [1763948366.6546] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58240 uid=0 result="success"
Nov 24 01:39:26 compute-0 NetworkManager[55458]: <info>  [1763948366.7910] checkpoint[0x55b80f9df950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 24 01:39:26 compute-0 NetworkManager[55458]: <info>  [1763948366.7912] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58240 uid=0 result="success"
Nov 24 01:39:26 compute-0 sudo[58571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qexamxwwzuftxaqrppbphnlimqfglpsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948366.5045967-295-246182107960817/AnsiballZ_async_status.py'
Nov 24 01:39:26 compute-0 sudo[58571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.1253] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.1268] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 python3.9[58574]: ansible-ansible.legacy.async_status Invoked with jid=j628661793950.58234 mode=status _async_dir=/root/.ansible_async
Nov 24 01:39:27 compute-0 sudo[58571]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.3280] audit: op="networking-control" arg="global-dns-configuration" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.3315] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.3368] audit: op="networking-control" arg="global-dns-configuration" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.3396] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.4815] checkpoint[0x55b80f9dfa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 24 01:39:27 compute-0 NetworkManager[55458]: <info>  [1763948367.4820] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58240 uid=0 result="success"
Nov 24 01:39:27 compute-0 ansible-async_wrapper.py[58238]: Module complete (58238)
Nov 24 01:39:28 compute-0 ansible-async_wrapper.py[58237]: Done in kid B.
Nov 24 01:39:30 compute-0 sudo[58676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmnzxbdgjvdwmvvorajtbpzzsjphsyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948366.5045967-295-246182107960817/AnsiballZ_async_status.py'
Nov 24 01:39:30 compute-0 sudo[58676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:30 compute-0 python3.9[58678]: ansible-ansible.legacy.async_status Invoked with jid=j628661793950.58234 mode=status _async_dir=/root/.ansible_async
Nov 24 01:39:30 compute-0 sudo[58676]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:30 compute-0 sudo[58776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbjvihcqvsbvnwyawcrogxpsajrwhhpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948366.5045967-295-246182107960817/AnsiballZ_async_status.py'
Nov 24 01:39:30 compute-0 sudo[58776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:31 compute-0 python3.9[58778]: ansible-ansible.legacy.async_status Invoked with jid=j628661793950.58234 mode=cleanup _async_dir=/root/.ansible_async
Nov 24 01:39:31 compute-0 sudo[58776]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:31 compute-0 sudo[58928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmiefhbsqxxvjpkzjhcajbbkxwkacrue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948371.3581667-322-196723906010176/AnsiballZ_stat.py'
Nov 24 01:39:31 compute-0 sudo[58928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:31 compute-0 python3.9[58930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:39:31 compute-0 sudo[58928]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:32 compute-0 sudo[59051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxhujdvapmvgleenqumnbbfvkhmgdpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948371.3581667-322-196723906010176/AnsiballZ_copy.py'
Nov 24 01:39:32 compute-0 sudo[59051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:32 compute-0 python3.9[59053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948371.3581667-322-196723906010176/.source.returncode _original_basename=.8l1yk_kr follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:32 compute-0 sudo[59051]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:32 compute-0 sudo[59203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbmtoygyjwukrmsnzembsdlryjdmzvnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948372.5752635-338-150182306591796/AnsiballZ_stat.py'
Nov 24 01:39:32 compute-0 sudo[59203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:33 compute-0 python3.9[59205]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:39:33 compute-0 sudo[59203]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:33 compute-0 sudo[59326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feovnkjqlroxkgfeullzvjashngrofpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948372.5752635-338-150182306591796/AnsiballZ_copy.py'
Nov 24 01:39:33 compute-0 sudo[59326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:33 compute-0 python3.9[59328]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948372.5752635-338-150182306591796/.source.cfg _original_basename=.eoopbh1r follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:33 compute-0 sudo[59326]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:34 compute-0 sudo[59479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqhhrwrmrxllyizbalhpztrhuozggivq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948373.7896426-353-175739724218075/AnsiballZ_systemd.py'
Nov 24 01:39:34 compute-0 sudo[59479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:34 compute-0 python3.9[59481]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:39:34 compute-0 systemd[1]: Reloading Network Manager...
Nov 24 01:39:34 compute-0 NetworkManager[55458]: <info>  [1763948374.4682] audit: op="reload" arg="0" pid=59485 uid=0 result="success"
Nov 24 01:39:34 compute-0 NetworkManager[55458]: <info>  [1763948374.4691] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 24 01:39:34 compute-0 systemd[1]: Reloaded Network Manager.
Nov 24 01:39:34 compute-0 sudo[59479]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:34 compute-0 sshd-session[51466]: Connection closed by 192.168.122.30 port 35276
Nov 24 01:39:34 compute-0 sshd-session[51463]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:39:34 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 24 01:39:34 compute-0 systemd[1]: session-11.scope: Consumed 49.760s CPU time.
Nov 24 01:39:34 compute-0 systemd-logind[791]: Session 11 logged out. Waiting for processes to exit.
Nov 24 01:39:34 compute-0 systemd-logind[791]: Removed session 11.
Nov 24 01:39:36 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 01:39:40 compute-0 sshd-session[59517]: Accepted publickey for zuul from 192.168.122.30 port 36762 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:39:40 compute-0 systemd-logind[791]: New session 12 of user zuul.
Nov 24 01:39:40 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 24 01:39:40 compute-0 sshd-session[59517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:39:41 compute-0 python3.9[59671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:39:42 compute-0 python3.9[59825]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:39:43 compute-0 python3.9[60014]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:39:43 compute-0 sshd-session[59520]: Connection closed by 192.168.122.30 port 36762
Nov 24 01:39:43 compute-0 sshd-session[59517]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:39:43 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 24 01:39:43 compute-0 systemd[1]: session-12.scope: Consumed 2.508s CPU time.
Nov 24 01:39:43 compute-0 systemd-logind[791]: Session 12 logged out. Waiting for processes to exit.
Nov 24 01:39:43 compute-0 systemd-logind[791]: Removed session 12.
Nov 24 01:39:44 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 01:39:49 compute-0 sshd-session[60043]: Accepted publickey for zuul from 192.168.122.30 port 41818 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:39:49 compute-0 systemd-logind[791]: New session 13 of user zuul.
Nov 24 01:39:49 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 24 01:39:49 compute-0 sshd-session[60043]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:39:50 compute-0 python3.9[60197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:39:50 compute-0 python3.9[60351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:39:51 compute-0 sudo[60505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjrjfqjorqnubdigkecnlwjuoxyvrov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948391.4728084-40-3572476608489/AnsiballZ_setup.py'
Nov 24 01:39:51 compute-0 sudo[60505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:52 compute-0 python3.9[60507]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:39:52 compute-0 sudo[60505]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:52 compute-0 sudo[60589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcsxbjicwplmrxecoqjhmvzxkzbcxdix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948391.4728084-40-3572476608489/AnsiballZ_dnf.py'
Nov 24 01:39:52 compute-0 sudo[60589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:52 compute-0 python3.9[60591]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:39:54 compute-0 sudo[60589]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:54 compute-0 sudo[60743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpzdnzigqazdfkddldcnpkctlbhyslby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948394.311946-52-66303469340567/AnsiballZ_setup.py'
Nov 24 01:39:54 compute-0 sudo[60743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:54 compute-0 python3.9[60745]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:39:55 compute-0 sudo[60743]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:55 compute-0 sudo[60934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xncwsfgfxyqivmiiigqvbkrershcbbmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948395.4074333-63-95911043888668/AnsiballZ_file.py'
Nov 24 01:39:55 compute-0 sudo[60934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:56 compute-0 python3.9[60936]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:56 compute-0 sudo[60934]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:56 compute-0 sudo[61087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulqfaleywzqutujirkqsvrrqdspaisga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948396.3248718-71-111844245381646/AnsiballZ_command.py'
Nov 24 01:39:56 compute-0 sudo[61087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:56 compute-0 python3.9[61089]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:39:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:39:56 compute-0 sudo[61087]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:57 compute-0 sudo[61249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjqemmvarryfboefbxrdyxbzoydukxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948397.1445749-79-24738609000785/AnsiballZ_stat.py'
Nov 24 01:39:57 compute-0 sudo[61249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:57 compute-0 python3.9[61251]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:39:57 compute-0 sudo[61249]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:57 compute-0 sudo[61327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbdqrxufsfnyisjravfojkvdetxeapfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948397.1445749-79-24738609000785/AnsiballZ_file.py'
Nov 24 01:39:58 compute-0 sudo[61327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:58 compute-0 python3.9[61329]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:39:58 compute-0 sudo[61327]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:58 compute-0 sudo[61479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzusdjjkahfuvnsswsoddiywtcefjswt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948398.4022598-91-35340819591666/AnsiballZ_stat.py'
Nov 24 01:39:58 compute-0 sudo[61479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:58 compute-0 python3.9[61481]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:39:58 compute-0 sudo[61479]: pam_unix(sudo:session): session closed for user root
Nov 24 01:39:59 compute-0 sudo[61557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmelhhgivfvbtovjkwdfexpahetzldqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948398.4022598-91-35340819591666/AnsiballZ_file.py'
Nov 24 01:39:59 compute-0 sudo[61557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:39:59 compute-0 python3.9[61559]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:39:59 compute-0 sudo[61557]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:00 compute-0 sudo[61709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlixiehxnntxlkctmjbtmglrnjfawrrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948399.5695753-104-258716586429723/AnsiballZ_ini_file.py'
Nov 24 01:40:00 compute-0 sudo[61709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:00 compute-0 python3.9[61711]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:00 compute-0 sudo[61709]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:00 compute-0 sudo[61861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytxqezwezdeztbcifmqnxmrfinapvycm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948400.4341109-104-171680382770459/AnsiballZ_ini_file.py'
Nov 24 01:40:00 compute-0 sudo[61861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:00 compute-0 python3.9[61863]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:00 compute-0 sudo[61861]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:01 compute-0 sudo[62013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnihjobrxfjppaoxkxhgzsswwwkprnmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948401.1064727-104-91476636309841/AnsiballZ_ini_file.py'
Nov 24 01:40:01 compute-0 sudo[62013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:01 compute-0 python3.9[62015]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:01 compute-0 sudo[62013]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:02 compute-0 sudo[62165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjcnnpinbvsaqmncokgqdxqexrofgame ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948401.7757177-104-33359209067383/AnsiballZ_ini_file.py'
Nov 24 01:40:02 compute-0 sudo[62165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:02 compute-0 python3.9[62167]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:02 compute-0 sudo[62165]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:02 compute-0 sudo[62317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydeksmzvsdirqbxrpuzkybtofslnvesb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948402.5625167-135-36697950371151/AnsiballZ_dnf.py'
Nov 24 01:40:02 compute-0 sudo[62317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:03 compute-0 python3.9[62319]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:40:04 compute-0 sudo[62317]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:05 compute-0 sudo[62470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xagtusevgewbmqqayaxxlgdjrychmmbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948404.842498-146-192745636593835/AnsiballZ_setup.py'
Nov 24 01:40:05 compute-0 sudo[62470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:05 compute-0 python3.9[62472]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:40:05 compute-0 sudo[62470]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:05 compute-0 sudo[62624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvvaiprowfvqpahoiyihhxpqefvpfjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948405.671773-154-85048540681326/AnsiballZ_stat.py'
Nov 24 01:40:05 compute-0 sudo[62624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:06 compute-0 python3.9[62626]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:40:06 compute-0 sudo[62624]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:06 compute-0 sudo[62776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-typvdtfrlzrlnapemugdljuukkrqiabc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948406.313288-163-118440943175813/AnsiballZ_stat.py'
Nov 24 01:40:06 compute-0 sudo[62776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:06 compute-0 python3.9[62778]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:40:06 compute-0 sudo[62776]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:07 compute-0 sudo[62929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pozelvriozhbsiannngtelbkxbxjqion ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948407.0754716-173-162831386746971/AnsiballZ_command.py'
Nov 24 01:40:07 compute-0 sudo[62929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:07 compute-0 python3.9[62931]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:40:07 compute-0 sudo[62929]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:08 compute-0 sudo[63082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgmecsqwmtgmpdzrpmtbaoxtvhctoud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948407.8967698-183-154424926725763/AnsiballZ_service_facts.py'
Nov 24 01:40:08 compute-0 sudo[63082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:08 compute-0 python3.9[63084]: ansible-service_facts Invoked
Nov 24 01:40:08 compute-0 network[63101]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:40:08 compute-0 network[63102]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:40:08 compute-0 network[63103]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:40:12 compute-0 sudo[63082]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:13 compute-0 sudo[63386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvcrmnasvplbbfjvorealsfywszwfkpl ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763948413.3138592-198-119314703694167/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763948413.3138592-198-119314703694167/args'
Nov 24 01:40:13 compute-0 sudo[63386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:13 compute-0 sudo[63386]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:14 compute-0 sudo[63553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bytuypylhmckekfyxcsmsuvxxgzobiwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948413.9271848-209-186998519794746/AnsiballZ_dnf.py'
Nov 24 01:40:14 compute-0 sudo[63553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:14 compute-0 python3.9[63555]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:40:15 compute-0 sudo[63553]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:16 compute-0 sudo[63706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gydyxywdiuakivsdhfqftabcswezvibe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948416.1342115-222-109384183214365/AnsiballZ_package_facts.py'
Nov 24 01:40:16 compute-0 sudo[63706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:17 compute-0 python3.9[63708]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 24 01:40:17 compute-0 sudo[63706]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:18 compute-0 sudo[63858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfgtakydxzwvbokneglxlzofgmnzjtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948417.6428237-232-171597942772886/AnsiballZ_stat.py'
Nov 24 01:40:18 compute-0 sudo[63858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:18 compute-0 python3.9[63860]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:18 compute-0 sudo[63858]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:18 compute-0 sudo[63983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uigkgneodppypurknzftylawgwbckbby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948417.6428237-232-171597942772886/AnsiballZ_copy.py'
Nov 24 01:40:18 compute-0 sudo[63983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:19 compute-0 python3.9[63985]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948417.6428237-232-171597942772886/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:19 compute-0 sudo[63983]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:19 compute-0 sudo[64137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzcoebhryeagrmfdjziybzxcqpxuionv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948419.3028188-247-162588397985446/AnsiballZ_stat.py'
Nov 24 01:40:19 compute-0 sudo[64137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:19 compute-0 python3.9[64139]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:19 compute-0 sudo[64137]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:20 compute-0 sudo[64262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehwvnyldzzhitnvjoshytrhtzgmyoefu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948419.3028188-247-162588397985446/AnsiballZ_copy.py'
Nov 24 01:40:20 compute-0 sudo[64262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:21 compute-0 python3.9[64264]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948419.3028188-247-162588397985446/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:21 compute-0 sudo[64262]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:22 compute-0 sudo[64418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyxrhdotzsvbevycweppummqxjyuwtkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948421.5803976-268-45942110941370/AnsiballZ_lineinfile.py'
Nov 24 01:40:22 compute-0 sudo[64418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:22 compute-0 python3.9[64420]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:22 compute-0 sudo[64418]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:22 compute-0 sshd-session[64291]: Received disconnect from 46.188.119.26 port 59738:11: Bye Bye [preauth]
Nov 24 01:40:22 compute-0 sshd-session[64291]: Disconnected from authenticating user root 46.188.119.26 port 59738 [preauth]
Nov 24 01:40:23 compute-0 sudo[64572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lerwcwfryleflvuftjimzzljovuevccw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948422.7557797-283-159704599010684/AnsiballZ_setup.py'
Nov 24 01:40:23 compute-0 sudo[64572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:23 compute-0 python3.9[64574]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:40:23 compute-0 sudo[64572]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:24 compute-0 sudo[64656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qngkixdsibvspxgpbuklpwhozxliunvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948422.7557797-283-159704599010684/AnsiballZ_systemd.py'
Nov 24 01:40:24 compute-0 sudo[64656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:24 compute-0 python3.9[64658]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:24 compute-0 sudo[64656]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:25 compute-0 sudo[64810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhixdikllardzubptmvnnwqhktbdbdpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948425.043734-299-42928899528643/AnsiballZ_setup.py'
Nov 24 01:40:25 compute-0 sudo[64810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:25 compute-0 python3.9[64812]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:40:25 compute-0 sudo[64810]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:26 compute-0 sudo[64894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxqxudgtkjkwgjyiiwnfcnxmshbsdou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948425.043734-299-42928899528643/AnsiballZ_systemd.py'
Nov 24 01:40:26 compute-0 sudo[64894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:26 compute-0 python3.9[64896]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:40:26 compute-0 chronyd[794]: chronyd exiting
Nov 24 01:40:26 compute-0 systemd[1]: Stopping NTP client/server...
Nov 24 01:40:26 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 24 01:40:26 compute-0 systemd[1]: Stopped NTP client/server.
Nov 24 01:40:26 compute-0 systemd[1]: Starting NTP client/server...
Nov 24 01:40:26 compute-0 chronyd[64904]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 01:40:26 compute-0 chronyd[64904]: Frequency -28.060 +/- 0.895 ppm read from /var/lib/chrony/drift
Nov 24 01:40:26 compute-0 chronyd[64904]: Loaded seccomp filter (level 2)
Nov 24 01:40:26 compute-0 systemd[1]: Started NTP client/server.
Nov 24 01:40:26 compute-0 sudo[64894]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:26 compute-0 sshd-session[60046]: Connection closed by 192.168.122.30 port 41818
Nov 24 01:40:26 compute-0 sshd-session[60043]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:40:26 compute-0 systemd-logind[791]: Session 13 logged out. Waiting for processes to exit.
Nov 24 01:40:26 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 24 01:40:26 compute-0 systemd[1]: session-13.scope: Consumed 26.777s CPU time.
Nov 24 01:40:26 compute-0 systemd-logind[791]: Removed session 13.
Nov 24 01:40:32 compute-0 sshd-session[64930]: Accepted publickey for zuul from 192.168.122.30 port 32968 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:40:32 compute-0 systemd-logind[791]: New session 14 of user zuul.
Nov 24 01:40:32 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 24 01:40:32 compute-0 sshd-session[64930]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:40:33 compute-0 python3.9[65083]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:40:34 compute-0 sudo[65237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izkfabvgxyioxidemcbrmaluwmoyohda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948433.616586-33-246673418944716/AnsiballZ_file.py'
Nov 24 01:40:34 compute-0 sudo[65237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:34 compute-0 python3.9[65239]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:34 compute-0 sudo[65237]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:34 compute-0 sudo[65412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nabpljgvkkjubcozlzevcawbwndqmzzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948434.4284415-41-45795647554313/AnsiballZ_stat.py'
Nov 24 01:40:35 compute-0 sudo[65412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:35 compute-0 python3.9[65414]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:35 compute-0 sudo[65412]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:35 compute-0 sudo[65490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecorlaterlywhyinmfmyrmoljgbuegww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948434.4284415-41-45795647554313/AnsiballZ_file.py'
Nov 24 01:40:35 compute-0 sudo[65490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:35 compute-0 python3.9[65492]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.rm6yktqp recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:35 compute-0 sudo[65490]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:36 compute-0 sudo[65642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dufyxsfvthlowfipwpjsqdvgfbgnoqqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948436.0582366-61-269386747795813/AnsiballZ_stat.py'
Nov 24 01:40:36 compute-0 sudo[65642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:36 compute-0 python3.9[65644]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:36 compute-0 sudo[65642]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:37 compute-0 sudo[65765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llbfkjwmzzprvjhhjkkhdrplbugwvaey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948436.0582366-61-269386747795813/AnsiballZ_copy.py'
Nov 24 01:40:37 compute-0 sudo[65765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:37 compute-0 python3.9[65767]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948436.0582366-61-269386747795813/.source _original_basename=.qiavh832 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:37 compute-0 sudo[65765]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:37 compute-0 sudo[65917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riszjciotstrcuicsdoopmyinxjedbws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948437.4497159-77-109548854581948/AnsiballZ_file.py'
Nov 24 01:40:37 compute-0 sudo[65917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:37 compute-0 python3.9[65919]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:37 compute-0 sudo[65917]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:38 compute-0 sudo[66069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxtyldcgugrawmedamhiwlzplfnawhqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948438.1053762-85-224025792407090/AnsiballZ_stat.py'
Nov 24 01:40:38 compute-0 sudo[66069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:38 compute-0 python3.9[66071]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:38 compute-0 sudo[66069]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:38 compute-0 sudo[66192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uomlklqeiodkyorybklbdvdpuatxzfxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948438.1053762-85-224025792407090/AnsiballZ_copy.py'
Nov 24 01:40:38 compute-0 sudo[66192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:39 compute-0 python3.9[66194]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948438.1053762-85-224025792407090/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:39 compute-0 sudo[66192]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:39 compute-0 sudo[66344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twiougefpuaopaiaxkjkiqlywdbqkrfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948439.321233-85-160327849934328/AnsiballZ_stat.py'
Nov 24 01:40:39 compute-0 sudo[66344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:39 compute-0 python3.9[66346]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:39 compute-0 sudo[66344]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:40 compute-0 sudo[66467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogmylzbqbnbubdxupzxjjflwyokyuquu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948439.321233-85-160327849934328/AnsiballZ_copy.py'
Nov 24 01:40:40 compute-0 sudo[66467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:40 compute-0 python3.9[66469]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948439.321233-85-160327849934328/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:40:40 compute-0 sudo[66467]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:40 compute-0 sudo[66619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtbldxazwzfwaofhkwcktdfusivevezn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948440.6115246-114-75601474173723/AnsiballZ_file.py'
Nov 24 01:40:40 compute-0 sudo[66619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:41 compute-0 python3.9[66621]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:41 compute-0 sudo[66619]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:41 compute-0 sudo[66771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjsgysrugaelufcfdtwusbhfdrstiqsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948441.296468-122-141988148936506/AnsiballZ_stat.py'
Nov 24 01:40:41 compute-0 sudo[66771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:41 compute-0 python3.9[66773]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:41 compute-0 sudo[66771]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:42 compute-0 sudo[66894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seaslhirgrsbxtavxmwbjpfjypwxowvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948441.296468-122-141988148936506/AnsiballZ_copy.py'
Nov 24 01:40:42 compute-0 sudo[66894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:42 compute-0 python3.9[66896]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948441.296468-122-141988148936506/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:42 compute-0 sudo[66894]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:42 compute-0 sudo[67046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojryihdxzcioybuahefxpenatdhoyzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948442.65351-137-248131692588796/AnsiballZ_stat.py'
Nov 24 01:40:42 compute-0 sudo[67046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:43 compute-0 python3.9[67048]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:43 compute-0 sudo[67046]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:43 compute-0 sudo[67169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytanufjtewggzniumfbtyurcflsgjbzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948442.65351-137-248131692588796/AnsiballZ_copy.py'
Nov 24 01:40:43 compute-0 sudo[67169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:43 compute-0 python3.9[67171]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948442.65351-137-248131692588796/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:43 compute-0 sudo[67169]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:44 compute-0 sudo[67321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdopblprydwfpijtymydzdjtlywtejjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948443.9184713-152-232893616307417/AnsiballZ_systemd.py'
Nov 24 01:40:44 compute-0 sudo[67321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:44 compute-0 python3.9[67323]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:44 compute-0 systemd[1]: Reloading.
Nov 24 01:40:44 compute-0 systemd-rc-local-generator[67350]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:44 compute-0 systemd-sysv-generator[67353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:45 compute-0 systemd[1]: Reloading.
Nov 24 01:40:45 compute-0 systemd-rc-local-generator[67386]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:45 compute-0 systemd-sysv-generator[67390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:45 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 24 01:40:45 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 24 01:40:45 compute-0 sudo[67321]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:45 compute-0 sudo[67549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbjzodvemxvrqxjcjiogaruhirlfqxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948445.448283-160-49859864090643/AnsiballZ_stat.py'
Nov 24 01:40:45 compute-0 sudo[67549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:45 compute-0 python3.9[67551]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:45 compute-0 sudo[67549]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:46 compute-0 sudo[67672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caxibjlggnivqwyaltfvivthzytaauoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948445.448283-160-49859864090643/AnsiballZ_copy.py'
Nov 24 01:40:46 compute-0 sudo[67672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:46 compute-0 python3.9[67674]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948445.448283-160-49859864090643/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:46 compute-0 sudo[67672]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:46 compute-0 sudo[67824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvsmitplqztkffwiyzwrhvncpcvbwwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948446.6400685-175-263295088178256/AnsiballZ_stat.py'
Nov 24 01:40:46 compute-0 sudo[67824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:47 compute-0 python3.9[67826]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:47 compute-0 sudo[67824]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:47 compute-0 sudo[67947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrbxmwizertxtzeeyhscidsirjtdicqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948446.6400685-175-263295088178256/AnsiballZ_copy.py'
Nov 24 01:40:47 compute-0 sudo[67947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:47 compute-0 python3.9[67949]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948446.6400685-175-263295088178256/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:47 compute-0 sudo[67947]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:48 compute-0 sudo[68099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajcagchmaublkzbyuorqsybqcknhvtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948447.7362068-190-48560164484740/AnsiballZ_systemd.py'
Nov 24 01:40:48 compute-0 sudo[68099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:48 compute-0 python3.9[68101]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:48 compute-0 systemd[1]: Reloading.
Nov 24 01:40:48 compute-0 systemd-rc-local-generator[68130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:48 compute-0 systemd-sysv-generator[68133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:48 compute-0 systemd[1]: Reloading.
Nov 24 01:40:48 compute-0 systemd-sysv-generator[68170]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:48 compute-0 systemd-rc-local-generator[68167]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:48 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 01:40:48 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 01:40:48 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 01:40:48 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 01:40:48 compute-0 sudo[68099]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:49 compute-0 python3.9[68328]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:40:49 compute-0 network[68345]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:40:49 compute-0 network[68346]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:40:49 compute-0 network[68347]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:40:52 compute-0 sudo[68607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywmncwyjpftrxsymmoadrkbfqbtexod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948452.6518683-206-51852425726281/AnsiballZ_systemd.py'
Nov 24 01:40:52 compute-0 sudo[68607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:53 compute-0 python3.9[68609]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:53 compute-0 systemd[1]: Reloading.
Nov 24 01:40:53 compute-0 systemd-sysv-generator[68638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:53 compute-0 systemd-rc-local-generator[68635]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:53 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 24 01:40:53 compute-0 iptables.init[68649]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 24 01:40:53 compute-0 iptables.init[68649]: iptables: Flushing firewall rules: [  OK  ]
Nov 24 01:40:53 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 24 01:40:53 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 24 01:40:53 compute-0 sudo[68607]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:54 compute-0 sudo[68843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbvahvaqxroitpxddttdzouqckysejup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948454.121223-206-148730811806333/AnsiballZ_systemd.py'
Nov 24 01:40:54 compute-0 sudo[68843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:54 compute-0 python3.9[68845]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:54 compute-0 sudo[68843]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:55 compute-0 sudo[68997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oayxrptzaaqfvplnqnzfwkfkhqgchykp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948454.9320407-222-266145182243178/AnsiballZ_systemd.py'
Nov 24 01:40:55 compute-0 sudo[68997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:55 compute-0 python3.9[68999]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:40:55 compute-0 systemd[1]: Reloading.
Nov 24 01:40:55 compute-0 systemd-sysv-generator[69033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:40:55 compute-0 systemd-rc-local-generator[69029]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:40:55 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 24 01:40:55 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 24 01:40:55 compute-0 sudo[68997]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:56 compute-0 sudo[69189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofdkfflfzppgzpkklgwnxbjkjkgrvlui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948456.0471406-230-198019459737784/AnsiballZ_command.py'
Nov 24 01:40:56 compute-0 sudo[69189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:56 compute-0 python3.9[69191]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:40:56 compute-0 sudo[69189]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:57 compute-0 sudo[69342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fizfracvkramdvpbxjsvnrzjgyfmuoii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948457.1913407-244-224154756461753/AnsiballZ_stat.py'
Nov 24 01:40:57 compute-0 sudo[69342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:57 compute-0 python3.9[69344]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:40:57 compute-0 sudo[69342]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:58 compute-0 sudo[69467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhatncbmvrdkhxkmduxzyucmroeopnqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948457.1913407-244-224154756461753/AnsiballZ_copy.py'
Nov 24 01:40:58 compute-0 sudo[69467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:58 compute-0 python3.9[69469]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948457.1913407-244-224154756461753/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:40:58 compute-0 sudo[69467]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:59 compute-0 sudo[69620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-famlhacpqxjkxihpqbfofcofbpfwkqrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948458.6268535-259-64305767816184/AnsiballZ_systemd.py'
Nov 24 01:40:59 compute-0 sudo[69620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:40:59 compute-0 python3.9[69622]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:40:59 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 24 01:40:59 compute-0 sshd[1005]: Received SIGHUP; restarting.
Nov 24 01:40:59 compute-0 sshd[1005]: Server listening on 0.0.0.0 port 22.
Nov 24 01:40:59 compute-0 sshd[1005]: Server listening on :: port 22.
Nov 24 01:40:59 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 24 01:40:59 compute-0 sudo[69620]: pam_unix(sudo:session): session closed for user root
Nov 24 01:40:59 compute-0 sudo[69776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sruazelsontjnyszyuxudpklwktsaytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948459.6413817-267-265093652781597/AnsiballZ_file.py'
Nov 24 01:40:59 compute-0 sudo[69776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:00 compute-0 python3.9[69778]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:00 compute-0 sudo[69776]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:00 compute-0 sudo[69928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrtxpotdyxiezzvxvbhyjkdxkleuqkeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948460.3813508-275-159147831862523/AnsiballZ_stat.py'
Nov 24 01:41:00 compute-0 sudo[69928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:00 compute-0 python3.9[69930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:00 compute-0 sudo[69928]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:01 compute-0 sudo[70051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpbniqzrllhwqbtnubxxrkhfspdfnkwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948460.3813508-275-159147831862523/AnsiballZ_copy.py'
Nov 24 01:41:01 compute-0 sudo[70051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:01 compute-0 python3.9[70053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948460.3813508-275-159147831862523/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:01 compute-0 sudo[70051]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:02 compute-0 sudo[70203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpmnobjioyigxpcmpptxtbwzlfpbmlkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948461.6780584-293-167050718259697/AnsiballZ_timezone.py'
Nov 24 01:41:02 compute-0 sudo[70203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:02 compute-0 python3.9[70205]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 01:41:02 compute-0 systemd[1]: Starting Time & Date Service...
Nov 24 01:41:02 compute-0 systemd[1]: Started Time & Date Service.
Nov 24 01:41:02 compute-0 sudo[70203]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:03 compute-0 sudo[70359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-widmzboohyusaoktjjztscfndmtycnzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948462.7138631-302-20900037201253/AnsiballZ_file.py'
Nov 24 01:41:03 compute-0 sudo[70359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:03 compute-0 python3.9[70361]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:03 compute-0 sudo[70359]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:03 compute-0 sudo[70511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cheljuxxbumkddowtulwiezieznofgrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948463.4230936-310-272613637360614/AnsiballZ_stat.py'
Nov 24 01:41:03 compute-0 sudo[70511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:03 compute-0 python3.9[70513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:03 compute-0 sudo[70511]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:04 compute-0 sudo[70634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdyzldmmzhffcgjcitfbvyfrjcxqzyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948463.4230936-310-272613637360614/AnsiballZ_copy.py'
Nov 24 01:41:04 compute-0 sudo[70634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:04 compute-0 python3.9[70636]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948463.4230936-310-272613637360614/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:04 compute-0 sudo[70634]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:04 compute-0 sudo[70786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulvzkwhyeelegixbwyxvurhyofcgwrpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948464.6691632-325-66728112883688/AnsiballZ_stat.py'
Nov 24 01:41:04 compute-0 sudo[70786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:05 compute-0 python3.9[70788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:05 compute-0 sudo[70786]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:05 compute-0 sudo[70909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doqzweqfpiuzkdananhybjyaficrgfwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948464.6691632-325-66728112883688/AnsiballZ_copy.py'
Nov 24 01:41:05 compute-0 sudo[70909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:05 compute-0 python3.9[70911]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948464.6691632-325-66728112883688/.source.yaml _original_basename=.tamengo_ follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:05 compute-0 sudo[70909]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:06 compute-0 sudo[71061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrygbksycrwpyfsqcyhantimqserxmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948465.8060617-340-4624587738168/AnsiballZ_stat.py'
Nov 24 01:41:06 compute-0 sudo[71061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:06 compute-0 python3.9[71063]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:06 compute-0 sudo[71061]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:06 compute-0 sudo[71184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sitnqxvqdbtwxbyojdyibjikdtquevkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948465.8060617-340-4624587738168/AnsiballZ_copy.py'
Nov 24 01:41:06 compute-0 sudo[71184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:06 compute-0 python3.9[71186]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948465.8060617-340-4624587738168/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:06 compute-0 sudo[71184]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:07 compute-0 sudo[71336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhhlufzjyefqcmlkhhorxlkajhwbamji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948467.0445824-355-117293654768377/AnsiballZ_command.py'
Nov 24 01:41:07 compute-0 sudo[71336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:07 compute-0 python3.9[71338]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:07 compute-0 sudo[71336]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:08 compute-0 sudo[71489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dufcddzefdukqdoxgiuukwlkhnhvojij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948467.7277446-363-37385189922487/AnsiballZ_command.py'
Nov 24 01:41:08 compute-0 sudo[71489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:08 compute-0 python3.9[71491]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:08 compute-0 sudo[71489]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:08 compute-0 sudo[71642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-febwhlupvdipolczroyrcoalmcsrsdgp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948468.3556342-371-187988370451130/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 01:41:08 compute-0 sudo[71642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:09 compute-0 python3[71644]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 01:41:09 compute-0 sudo[71642]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:09 compute-0 sudo[71794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfebxbpjxeqejlloxwzwstrggwjqoxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948469.2209797-379-173942957770036/AnsiballZ_stat.py'
Nov 24 01:41:09 compute-0 sudo[71794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:09 compute-0 python3.9[71796]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:09 compute-0 sudo[71794]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:10 compute-0 sudo[71917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnpwrjtcbxsaprhddqowmfvunwicjtzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948469.2209797-379-173942957770036/AnsiballZ_copy.py'
Nov 24 01:41:10 compute-0 sudo[71917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:10 compute-0 python3.9[71919]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948469.2209797-379-173942957770036/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:10 compute-0 sudo[71917]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:10 compute-0 sudo[72069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsybarnvtspuuccdoqlbjtxiogmcrnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948470.5393727-394-73192805458732/AnsiballZ_stat.py'
Nov 24 01:41:10 compute-0 sudo[72069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:11 compute-0 python3.9[72071]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:11 compute-0 sudo[72069]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:11 compute-0 sudo[72192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbpwscgbdqucfgdagrmybstcojzmebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948470.5393727-394-73192805458732/AnsiballZ_copy.py'
Nov 24 01:41:11 compute-0 sudo[72192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:11 compute-0 python3.9[72194]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948470.5393727-394-73192805458732/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:11 compute-0 sudo[72192]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:12 compute-0 sudo[72344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwpzgqrpsfcbfrzimpgnmwdgylbshqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948471.750651-409-174812269397896/AnsiballZ_stat.py'
Nov 24 01:41:12 compute-0 sudo[72344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:12 compute-0 python3.9[72346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:12 compute-0 sudo[72344]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:12 compute-0 sudo[72467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrfrugtwzvzpyxklkusjkgfbfyrovaae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948471.750651-409-174812269397896/AnsiballZ_copy.py'
Nov 24 01:41:12 compute-0 sudo[72467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:12 compute-0 python3.9[72469]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948471.750651-409-174812269397896/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:12 compute-0 sudo[72467]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:13 compute-0 sudo[72619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euansgwdepjimuklqxhtebaltyqkuwtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948472.9857693-424-278792282428577/AnsiballZ_stat.py'
Nov 24 01:41:13 compute-0 sudo[72619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:13 compute-0 python3.9[72621]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:13 compute-0 sudo[72619]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:13 compute-0 sudo[72742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsddiqlkmvtzcmxuyjmzkxugkzsksyxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948472.9857693-424-278792282428577/AnsiballZ_copy.py'
Nov 24 01:41:13 compute-0 sudo[72742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:14 compute-0 python3.9[72744]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948472.9857693-424-278792282428577/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:14 compute-0 sudo[72742]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:14 compute-0 sudo[72894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyekxasxsdujdmwwkcooojqwcbkogqzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948474.2158923-439-265116705288211/AnsiballZ_stat.py'
Nov 24 01:41:14 compute-0 sudo[72894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:14 compute-0 python3.9[72896]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:41:14 compute-0 sudo[72894]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:15 compute-0 sudo[73017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiizgqpoixxopqtmtgvcmnilgdyhlieq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948474.2158923-439-265116705288211/AnsiballZ_copy.py'
Nov 24 01:41:15 compute-0 sudo[73017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:15 compute-0 python3.9[73019]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948474.2158923-439-265116705288211/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:15 compute-0 sudo[73017]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:15 compute-0 sudo[73169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ythnyykenqygqqlzqyttjkhxljefraqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948475.377221-454-234238507683222/AnsiballZ_file.py'
Nov 24 01:41:15 compute-0 sudo[73169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:15 compute-0 python3.9[73171]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:15 compute-0 sudo[73169]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:16 compute-0 sudo[73321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzcmjpdisycinkotxklgvmdrbhifvefz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948476.0044212-462-179805962542648/AnsiballZ_command.py'
Nov 24 01:41:16 compute-0 sudo[73321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:16 compute-0 python3.9[73323]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:16 compute-0 sudo[73321]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:17 compute-0 sudo[73480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boxtgrotqhyneqwsihbqkzclwzczwatk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948476.760563-470-64531375817840/AnsiballZ_blockinfile.py'
Nov 24 01:41:17 compute-0 sudo[73480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:17 compute-0 python3.9[73482]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:17 compute-0 sudo[73480]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:17 compute-0 sudo[73633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuirrdmtqxnspbnactxumjvjnbbhczql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948477.7116477-479-4552617687333/AnsiballZ_file.py'
Nov 24 01:41:17 compute-0 sudo[73633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:18 compute-0 python3.9[73635]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:18 compute-0 sudo[73633]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:18 compute-0 sudo[73785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yptzaubhgpdcnvwxhzwclvbfzlwbkdxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948478.2811396-479-226046015953059/AnsiballZ_file.py'
Nov 24 01:41:18 compute-0 sudo[73785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:18 compute-0 python3.9[73787]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:18 compute-0 sudo[73785]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:19 compute-0 sudo[73937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xajtgghuqaefwvnodihkjzdxvforrqyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948479.1176696-494-5262863693303/AnsiballZ_mount.py'
Nov 24 01:41:19 compute-0 sudo[73937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:19 compute-0 python3.9[73939]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 01:41:19 compute-0 sudo[73937]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:19 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:41:20 compute-0 sudo[74091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhuhgxsqestrpxcfvmlcvxjqlwxspkqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948480.034026-494-113780318918264/AnsiballZ_mount.py'
Nov 24 01:41:20 compute-0 sudo[74091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:20 compute-0 python3.9[74093]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 01:41:20 compute-0 sudo[74091]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:20 compute-0 sshd-session[64933]: Connection closed by 192.168.122.30 port 32968
Nov 24 01:41:20 compute-0 sshd-session[64930]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:41:20 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 24 01:41:20 compute-0 systemd[1]: session-14.scope: Consumed 36.573s CPU time.
Nov 24 01:41:20 compute-0 systemd-logind[791]: Session 14 logged out. Waiting for processes to exit.
Nov 24 01:41:20 compute-0 systemd-logind[791]: Removed session 14.
Nov 24 01:41:26 compute-0 sshd-session[74119]: Accepted publickey for zuul from 192.168.122.30 port 55722 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:41:26 compute-0 systemd-logind[791]: New session 15 of user zuul.
Nov 24 01:41:26 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 24 01:41:26 compute-0 sshd-session[74119]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:41:26 compute-0 sudo[74272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utgrccllrbfskufmcjmambzuzqmwqgsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948486.4477665-16-153878649626535/AnsiballZ_tempfile.py'
Nov 24 01:41:26 compute-0 sudo[74272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:27 compute-0 python3.9[74274]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 24 01:41:27 compute-0 sudo[74272]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:27 compute-0 sudo[74424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aouozpgyzldjzlsehzjrcapavpudikbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948487.2960458-28-46608441048019/AnsiballZ_stat.py'
Nov 24 01:41:27 compute-0 sudo[74424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:27 compute-0 python3.9[74426]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:41:27 compute-0 sudo[74424]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:28 compute-0 sudo[74576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssqjkxoysobngjhztwxkvurxwwrgpyzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948488.0811877-38-37102515672856/AnsiballZ_setup.py'
Nov 24 01:41:28 compute-0 sudo[74576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:29 compute-0 python3.9[74578]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:41:29 compute-0 sudo[74576]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:29 compute-0 sudo[74728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmfczbkztyjfpkawlkzuxrniazmjwvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948489.4017224-47-246345213961826/AnsiballZ_blockinfile.py'
Nov 24 01:41:29 compute-0 sudo[74728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:30 compute-0 python3.9[74730]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvYSO3mRQD4yof4hctEF1U5hS379lF5O3FtWfw0YmLBnbAmnlQyt/9aBiyIjWu+dS2PG1kMZ8I6r0p+vcD33laPLsRg3/8zixz/myO+Z7+fsiufTDj0QF1sU5sBUDCh3cstGSfL2QSiATeHrBEgu8HRGbhFaiTs0dT7pkW4Lzong83bZXMqgHD54q+tmX/iqWI02VrTRzW2Wl7ziSE6T2iSQ7abHaM5LQ82t7LfwrD85oqeUt7ItyXUC1Ehn2oKornPz8v227Xjz5W9H+U0reV+PpvIj3GWvDFbA5Sr2XW0ss2aEAurlfmXG9iBrQ7jqKAqRolqlaWHwFOhl4489R1TctcV7hacKeeIp/jsx3rjKbyW+sOZ02gsjgjJwCDdEinALSGgY43itjf39rjagKVO8RmTzTcOKGTRU1fy17gcVEfpfbHxkji0anfz5vqwIeoYEXU/0VwCrDruYM7yXpzJxkflFep2lKmk0QXVqsrouwzikQTPaiEhbD0zjqyqFE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILZqa6kodhul7iyDhPjQ4DYoeD+mEo8pd0gJLWmHGEm+
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBNUC6IYcktwyC3jOJgyUtj1wjFThLrcF4veiMCKZ8QMhk1nZcA+DefkqXTTy9N71H6uE8N4Ovj1UknptmWWuTc=
                                             create=True mode=0644 path=/tmp/ansible.wjqa7b5w state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:30 compute-0 sudo[74728]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:30 compute-0 sudo[74880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djpzuiidtioujtzxjkmfpuyvkkucpwwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948490.174989-55-231456437413036/AnsiballZ_command.py'
Nov 24 01:41:30 compute-0 sudo[74880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:30 compute-0 python3.9[74882]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wjqa7b5w' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:30 compute-0 sudo[74880]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:31 compute-0 sudo[75034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuekvamiutlsstschrwnwnzgoqmpjatf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948491.0568871-63-111521366104379/AnsiballZ_file.py'
Nov 24 01:41:31 compute-0 sudo[75034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:31 compute-0 python3.9[75036]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.wjqa7b5w state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:31 compute-0 sudo[75034]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:31 compute-0 sshd-session[74122]: Connection closed by 192.168.122.30 port 55722
Nov 24 01:41:31 compute-0 sshd-session[74119]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:41:31 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 24 01:41:31 compute-0 systemd[1]: session-15.scope: Consumed 3.512s CPU time.
Nov 24 01:41:31 compute-0 systemd-logind[791]: Session 15 logged out. Waiting for processes to exit.
Nov 24 01:41:31 compute-0 systemd-logind[791]: Removed session 15.
Nov 24 01:41:32 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 01:41:36 compute-0 sshd-session[75064]: Invalid user elena from 46.188.119.26 port 60066
Nov 24 01:41:36 compute-0 sshd-session[75064]: Received disconnect from 46.188.119.26 port 60066:11: Bye Bye [preauth]
Nov 24 01:41:36 compute-0 sshd-session[75064]: Disconnected from invalid user elena 46.188.119.26 port 60066 [preauth]
Nov 24 01:41:37 compute-0 sshd-session[75066]: Accepted publickey for zuul from 192.168.122.30 port 38066 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:41:37 compute-0 systemd-logind[791]: New session 16 of user zuul.
Nov 24 01:41:37 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 24 01:41:37 compute-0 sshd-session[75066]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:41:38 compute-0 python3.9[75219]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:41:39 compute-0 sudo[75373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbasgzddnyrbmegkekjtvyxrmsiuimvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948499.0369372-32-43485268514775/AnsiballZ_systemd.py'
Nov 24 01:41:39 compute-0 sudo[75373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:39 compute-0 python3.9[75375]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 01:41:40 compute-0 sudo[75373]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:40 compute-0 sudo[75527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyqgabumovrychkbclzhoizsslkhrqqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948500.2796319-40-196975983150186/AnsiballZ_systemd.py'
Nov 24 01:41:40 compute-0 sudo[75527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:40 compute-0 python3.9[75529]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:41:40 compute-0 sudo[75527]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:41 compute-0 sudo[75680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudlgzbvvoytokyxnrmvvvfjimpupbof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948501.2019234-49-202795041149433/AnsiballZ_command.py'
Nov 24 01:41:41 compute-0 sudo[75680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:41 compute-0 python3.9[75682]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:41 compute-0 sudo[75680]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:42 compute-0 sudo[75833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rccswbpdmxbrlexewjoqivfxalvivmdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948502.0139105-57-68472805296296/AnsiballZ_stat.py'
Nov 24 01:41:42 compute-0 sudo[75833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:42 compute-0 python3.9[75835]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:41:42 compute-0 sudo[75833]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:43 compute-0 sudo[75987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbbyrsfokakfpfftzmaalecpiltqbqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948502.8484204-65-111797298771187/AnsiballZ_command.py'
Nov 24 01:41:43 compute-0 sudo[75987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:43 compute-0 python3.9[75989]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:43 compute-0 sudo[75987]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:44 compute-0 sudo[76142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssxytpshfvnlnmyiufbstvomxksvsulx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948503.625894-73-221625621549580/AnsiballZ_file.py'
Nov 24 01:41:44 compute-0 sudo[76142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:44 compute-0 python3.9[76144]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:41:44 compute-0 sudo[76142]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:44 compute-0 sshd-session[75069]: Connection closed by 192.168.122.30 port 38066
Nov 24 01:41:44 compute-0 sshd-session[75066]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:41:44 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 24 01:41:44 compute-0 systemd[1]: session-16.scope: Consumed 4.806s CPU time.
Nov 24 01:41:44 compute-0 systemd-logind[791]: Session 16 logged out. Waiting for processes to exit.
Nov 24 01:41:44 compute-0 systemd-logind[791]: Removed session 16.
Nov 24 01:41:49 compute-0 sshd-session[76169]: Accepted publickey for zuul from 192.168.122.30 port 44456 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:41:49 compute-0 systemd-logind[791]: New session 17 of user zuul.
Nov 24 01:41:49 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 24 01:41:49 compute-0 sshd-session[76169]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:41:50 compute-0 python3.9[76322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:41:51 compute-0 sudo[76476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgdnbhjjmyxkmvuyapdmnpllpbjliqxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948510.9649107-34-164088308303838/AnsiballZ_setup.py'
Nov 24 01:41:51 compute-0 sudo[76476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:51 compute-0 python3.9[76478]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:41:51 compute-0 sudo[76476]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:52 compute-0 sudo[76560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjbhwzxcpscsgeooxfyisowpvftwqgzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948510.9649107-34-164088308303838/AnsiballZ_dnf.py'
Nov 24 01:41:52 compute-0 sudo[76560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:41:52 compute-0 python3.9[76562]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 01:41:53 compute-0 sudo[76560]: pam_unix(sudo:session): session closed for user root
Nov 24 01:41:54 compute-0 python3.9[76713]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:41:55 compute-0 python3.9[76864]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:41:56 compute-0 python3.9[77014]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:41:57 compute-0 python3.9[77164]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:41:57 compute-0 sshd-session[76172]: Connection closed by 192.168.122.30 port 44456
Nov 24 01:41:57 compute-0 sshd-session[76169]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:41:57 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 24 01:41:57 compute-0 systemd[1]: session-17.scope: Consumed 5.866s CPU time.
Nov 24 01:41:57 compute-0 systemd-logind[791]: Session 17 logged out. Waiting for processes to exit.
Nov 24 01:41:57 compute-0 systemd-logind[791]: Removed session 17.
Nov 24 01:42:02 compute-0 sshd-session[77190]: Accepted publickey for zuul from 192.168.122.30 port 41584 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:42:02 compute-0 systemd-logind[791]: New session 18 of user zuul.
Nov 24 01:42:02 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 24 01:42:02 compute-0 sshd-session[77190]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:42:03 compute-0 python3.9[77343]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:42:05 compute-0 sudo[77497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fabxeyjueibtjiivbcewceaeulgoqzex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948524.830697-50-206188443496179/AnsiballZ_file.py'
Nov 24 01:42:05 compute-0 sudo[77497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:05 compute-0 python3.9[77499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:05 compute-0 sudo[77497]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:05 compute-0 sudo[77649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfthayuuktmwqjnwjyrpxwtbkurvwjya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948525.665216-50-266638385702724/AnsiballZ_file.py'
Nov 24 01:42:05 compute-0 sudo[77649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:06 compute-0 python3.9[77651]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:06 compute-0 sudo[77649]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:06 compute-0 sudo[77801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjonqmbvambfqxvlurcozwhqlgclmfpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948526.336048-65-250268462268807/AnsiballZ_stat.py'
Nov 24 01:42:06 compute-0 sudo[77801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:06 compute-0 python3.9[77803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:06 compute-0 sudo[77801]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:07 compute-0 sudo[77924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weprukqpqivncqzqobihnblooyzqbyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948526.336048-65-250268462268807/AnsiballZ_copy.py'
Nov 24 01:42:07 compute-0 sudo[77924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:07 compute-0 python3.9[77926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948526.336048-65-250268462268807/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=30f38ad2294464d45727dfaba261043e69682cb7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:07 compute-0 sudo[77924]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:08 compute-0 sudo[78076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkgoeqjzsaozjopixoqusmegkxfytdqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948527.942904-65-130683810897823/AnsiballZ_stat.py'
Nov 24 01:42:08 compute-0 sudo[78076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:08 compute-0 python3.9[78078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:08 compute-0 sudo[78076]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:08 compute-0 sudo[78199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txsryydogkxtwqwghszzyemtgyrxtzaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948527.942904-65-130683810897823/AnsiballZ_copy.py'
Nov 24 01:42:08 compute-0 sudo[78199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:09 compute-0 python3.9[78201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948527.942904-65-130683810897823/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=2a98811a4efe9ed92d38ab8b3130dbd402460dff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:09 compute-0 sudo[78199]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:09 compute-0 sudo[78351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmoqswmukwkdkqwbgtamxztkehpclwlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948529.2121882-65-241423061101287/AnsiballZ_stat.py'
Nov 24 01:42:09 compute-0 sudo[78351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:09 compute-0 python3.9[78353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:09 compute-0 sudo[78351]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:10 compute-0 sudo[78474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwbwxbkkruslyyckpqbbcshwcowkgzmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948529.2121882-65-241423061101287/AnsiballZ_copy.py'
Nov 24 01:42:10 compute-0 sudo[78474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:10 compute-0 python3.9[78476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948529.2121882-65-241423061101287/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3c25f6760d7621c6455d3431cf224afea2e20244 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:10 compute-0 sudo[78474]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:10 compute-0 sudo[78626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypflxeupfnalqnigujrnvvgktcqtmbfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948530.5006373-109-185359643229969/AnsiballZ_file.py'
Nov 24 01:42:10 compute-0 sudo[78626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:10 compute-0 python3.9[78628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:10 compute-0 sudo[78626]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:11 compute-0 sudo[78778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjxfcoqxrfbxeocwespqykarkzulqhql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948531.0978496-109-325894048823/AnsiballZ_file.py'
Nov 24 01:42:11 compute-0 sudo[78778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:11 compute-0 python3.9[78780]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:11 compute-0 sudo[78778]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:12 compute-0 sudo[78930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tztwvtacztfzdoypneymufynksaxfokz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948531.7468414-124-242106994545237/AnsiballZ_stat.py'
Nov 24 01:42:12 compute-0 sudo[78930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:12 compute-0 python3.9[78932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:12 compute-0 sudo[78930]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:12 compute-0 sudo[79053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkbbckrmnsuxlpkzglqxpbzzmfnpvose ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948531.7468414-124-242106994545237/AnsiballZ_copy.py'
Nov 24 01:42:12 compute-0 sudo[79053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:12 compute-0 python3.9[79055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948531.7468414-124-242106994545237/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=ffa06abe0f1727e6e445d9dc089763f26edb4ae3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:12 compute-0 sudo[79053]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:13 compute-0 sudo[79205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xussquqdhsfftlnvggwoegitowwwuqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948532.9295378-124-232093734911876/AnsiballZ_stat.py'
Nov 24 01:42:13 compute-0 sudo[79205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:13 compute-0 python3.9[79207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:13 compute-0 sudo[79205]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:13 compute-0 sudo[79328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxrmrpjbnqdybyblonfrpimdhgzsrprp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948532.9295378-124-232093734911876/AnsiballZ_copy.py'
Nov 24 01:42:13 compute-0 sudo[79328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:14 compute-0 python3.9[79330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948532.9295378-124-232093734911876/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=931d836bde2eebd6234146decd39a0e8c865c421 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:14 compute-0 sudo[79328]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:14 compute-0 sudo[79480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utgrwvlgrcvwzkdgmljronmidbtjayhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948534.1751714-124-257103504253703/AnsiballZ_stat.py'
Nov 24 01:42:14 compute-0 sudo[79480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:14 compute-0 python3.9[79482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:14 compute-0 sudo[79480]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:14 compute-0 sudo[79603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fatdrfhdghlxfobmlchrfrgxphvapucw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948534.1751714-124-257103504253703/AnsiballZ_copy.py'
Nov 24 01:42:14 compute-0 sudo[79603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:15 compute-0 python3.9[79605]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948534.1751714-124-257103504253703/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=83928f6e9eb51604265ef7099a117ad3fcce2c8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:15 compute-0 sudo[79603]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:15 compute-0 sudo[79755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aijzktntgjvnrjjmmczivzahdmfwosve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948535.4048603-168-206123159465607/AnsiballZ_file.py'
Nov 24 01:42:15 compute-0 sudo[79755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:15 compute-0 python3.9[79757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:15 compute-0 sudo[79755]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:16 compute-0 sudo[79907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abpjifwqshokxoyztfmpyprirpsdutnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948536.0466824-168-123967416940894/AnsiballZ_file.py'
Nov 24 01:42:16 compute-0 sudo[79907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:16 compute-0 python3.9[79909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:16 compute-0 sudo[79907]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:17 compute-0 sudo[80059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sttkfwvrmezccchdngwgkpznywfqrdej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948536.7723126-183-131587358360592/AnsiballZ_stat.py'
Nov 24 01:42:17 compute-0 sudo[80059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:17 compute-0 python3.9[80061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:17 compute-0 sudo[80059]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:17 compute-0 sudo[80182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thrhaacwdbkwgvytblidsarrlcrydszy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948536.7723126-183-131587358360592/AnsiballZ_copy.py'
Nov 24 01:42:17 compute-0 sudo[80182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:17 compute-0 python3.9[80184]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948536.7723126-183-131587358360592/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d0e21af5746e117eaf1726e6608c09a2c6902312 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:17 compute-0 sudo[80182]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:18 compute-0 sudo[80334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjkdzirqyzklhbxtegkcjrcehbtwwhho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948538.0265007-183-35631864641600/AnsiballZ_stat.py'
Nov 24 01:42:18 compute-0 sudo[80334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:18 compute-0 python3.9[80336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:18 compute-0 sudo[80334]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:19 compute-0 sudo[80457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erlzrbnawluwoizuwldofbwmmzrxsmch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948538.0265007-183-35631864641600/AnsiballZ_copy.py'
Nov 24 01:42:19 compute-0 sudo[80457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:19 compute-0 python3.9[80459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948538.0265007-183-35631864641600/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=0b1649bbdee1cf5262ed7e40df00014176154204 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:19 compute-0 sudo[80457]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:19 compute-0 sudo[80609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnyawuhzpbhdqdzhbfplvrkkrfhscdlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948539.3781762-183-231281860157406/AnsiballZ_stat.py'
Nov 24 01:42:19 compute-0 sudo[80609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:19 compute-0 python3.9[80611]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:19 compute-0 sudo[80609]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:20 compute-0 sudo[80733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcokkwjiliivwqrsajxpokpjhgvubfph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948539.3781762-183-231281860157406/AnsiballZ_copy.py'
Nov 24 01:42:20 compute-0 sudo[80733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:21 compute-0 python3.9[80735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948539.3781762-183-231281860157406/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b67cd55e50e2874a83d308d4ed28707718c4b9ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:21 compute-0 sudo[80733]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:21 compute-0 sudo[80885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikleyplexwbgahndrubajncjihulfslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948541.313532-227-159558058815008/AnsiballZ_file.py'
Nov 24 01:42:21 compute-0 sudo[80885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:21 compute-0 python3.9[80887]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:21 compute-0 sudo[80885]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:22 compute-0 sudo[81037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmaufcyjfyqhnrncbcakeycuvdltdmwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948542.6296957-227-93377023031580/AnsiballZ_file.py'
Nov 24 01:42:22 compute-0 sudo[81037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:23 compute-0 python3.9[81039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:23 compute-0 sudo[81037]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:23 compute-0 sudo[81189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfqznibqovjkwqhkwjtvwkqrqoscxmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948543.4073994-242-115141077498465/AnsiballZ_stat.py'
Nov 24 01:42:23 compute-0 sudo[81189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:23 compute-0 python3.9[81191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:23 compute-0 sudo[81189]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:24 compute-0 sudo[81312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hewoapnhbvcxfmgbdpdhevtxfcdslclz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948543.4073994-242-115141077498465/AnsiballZ_copy.py'
Nov 24 01:42:24 compute-0 sudo[81312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:24 compute-0 python3.9[81314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948543.4073994-242-115141077498465/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=27547859d8bb667d790b3ed31364ec96a2fe04c2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:24 compute-0 sudo[81312]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:24 compute-0 sudo[81464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trbbaqjzcabrvrsadjmrnrqpdpkqjgww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948544.7002666-242-25716470288362/AnsiballZ_stat.py'
Nov 24 01:42:24 compute-0 sudo[81464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:25 compute-0 python3.9[81466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:25 compute-0 sudo[81464]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:25 compute-0 sudo[81587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jardlwcmirrorcopevrfmmawqwpgkdlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948544.7002666-242-25716470288362/AnsiballZ_copy.py'
Nov 24 01:42:25 compute-0 sudo[81587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:25 compute-0 python3.9[81589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948544.7002666-242-25716470288362/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=0b1649bbdee1cf5262ed7e40df00014176154204 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:25 compute-0 sudo[81587]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:26 compute-0 sudo[81739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zciazbohuoncihuyfgegyilcabvjmrap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948545.904097-242-39093598026396/AnsiballZ_stat.py'
Nov 24 01:42:26 compute-0 sudo[81739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:26 compute-0 python3.9[81741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:26 compute-0 sudo[81739]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:26 compute-0 sudo[81862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvkvzaulvtbcdjbphgxcppzgrepbpzzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948545.904097-242-39093598026396/AnsiballZ_copy.py'
Nov 24 01:42:26 compute-0 sudo[81862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:27 compute-0 python3.9[81864]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948545.904097-242-39093598026396/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=69806db0303a210c61d6d0597b56112097f59a84 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:27 compute-0 sudo[81862]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:27 compute-0 sudo[82014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrqijkbbxuvboqpqxqbgubwlwlvpsitw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948547.7053795-302-177104493755801/AnsiballZ_file.py'
Nov 24 01:42:28 compute-0 sudo[82014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:28 compute-0 python3.9[82016]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:28 compute-0 sudo[82014]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:28 compute-0 sudo[82166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spsczwbadqtxglmhcbkezafelrchgpor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948548.3909905-310-17908200480852/AnsiballZ_stat.py'
Nov 24 01:42:28 compute-0 sudo[82166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:28 compute-0 python3.9[82168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:28 compute-0 sudo[82166]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:29 compute-0 sudo[82289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tudfuqsqhsugaruremzknihoyjnhfniq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948548.3909905-310-17908200480852/AnsiballZ_copy.py'
Nov 24 01:42:29 compute-0 sudo[82289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:29 compute-0 python3.9[82291]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948548.3909905-310-17908200480852/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:29 compute-0 sudo[82289]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:29 compute-0 sudo[82441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyxobwkppjtbihlzpxkmtlmsrujikidq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948549.6245267-326-111438424131243/AnsiballZ_file.py'
Nov 24 01:42:29 compute-0 sudo[82441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:30 compute-0 python3.9[82443]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:30 compute-0 sudo[82441]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:30 compute-0 sudo[82593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-typhaaockecksqhcifkhjyksznbxadjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948550.3679125-334-7930284531895/AnsiballZ_stat.py'
Nov 24 01:42:30 compute-0 sudo[82593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:30 compute-0 python3.9[82595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:30 compute-0 sudo[82593]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:31 compute-0 sudo[82716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfxiuyfwpnbwrizwukxqwgnoekoiqgcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948550.3679125-334-7930284531895/AnsiballZ_copy.py'
Nov 24 01:42:31 compute-0 sudo[82716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:31 compute-0 python3.9[82718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948550.3679125-334-7930284531895/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:31 compute-0 sudo[82716]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:31 compute-0 sudo[82868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnkaajopnymogdevvjhkurcoyjbgvmte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948551.6633523-350-69169822986443/AnsiballZ_file.py'
Nov 24 01:42:31 compute-0 sudo[82868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:32 compute-0 python3.9[82870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:32 compute-0 sudo[82868]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:32 compute-0 sudo[83020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkpzxifyimqijlzzzclebiyumniftba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948552.3026354-358-33815461594849/AnsiballZ_stat.py'
Nov 24 01:42:32 compute-0 sudo[83020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:32 compute-0 python3.9[83022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:32 compute-0 sudo[83020]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:33 compute-0 sudo[83143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zllmkjxteaodvbyjygbxwxvacikvilsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948552.3026354-358-33815461594849/AnsiballZ_copy.py'
Nov 24 01:42:33 compute-0 sudo[83143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:33 compute-0 python3.9[83145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948552.3026354-358-33815461594849/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:33 compute-0 sudo[83143]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:33 compute-0 sudo[83295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxkoplmeyppfvgodvgvtjassfxdqxpdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948553.5963213-374-51232584926343/AnsiballZ_file.py'
Nov 24 01:42:33 compute-0 sudo[83295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:34 compute-0 python3.9[83297]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:34 compute-0 sudo[83295]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:34 compute-0 sudo[83447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlqlbacqtuiprxrvgcifmyizuuymjqfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948554.257031-382-75429398266242/AnsiballZ_stat.py'
Nov 24 01:42:34 compute-0 sudo[83447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:34 compute-0 python3.9[83449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:34 compute-0 sudo[83447]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:35 compute-0 sudo[83570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqqioqdeqblkvzmyvnzxskffxyfjchzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948554.257031-382-75429398266242/AnsiballZ_copy.py'
Nov 24 01:42:35 compute-0 sudo[83570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:35 compute-0 chronyd[64904]: Selected source 23.133.168.244 (pool.ntp.org)
Nov 24 01:42:35 compute-0 python3.9[83572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948554.257031-382-75429398266242/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:35 compute-0 sudo[83570]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:35 compute-0 sudo[83722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylrythoiybrbrvyfoazfzsdzajbsndsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948555.69912-398-108962428885924/AnsiballZ_file.py'
Nov 24 01:42:36 compute-0 sudo[83722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:36 compute-0 python3.9[83724]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:36 compute-0 sudo[83722]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:36 compute-0 sudo[83874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djezsnfvsqneauignjzqyaiswcdrgkip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948556.4127574-406-227529849230772/AnsiballZ_stat.py'
Nov 24 01:42:36 compute-0 sudo[83874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:37 compute-0 python3.9[83876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:37 compute-0 sudo[83874]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:37 compute-0 sudo[83997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxsifgbpmjoemlxjwgfwsoovlrvayzcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948556.4127574-406-227529849230772/AnsiballZ_copy.py'
Nov 24 01:42:37 compute-0 sudo[83997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:37 compute-0 python3.9[83999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948556.4127574-406-227529849230772/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:37 compute-0 sudo[83997]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:38 compute-0 sudo[84149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzkypzwvekwqrdnbhsczmodvidjukfzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948557.942356-422-71590115917383/AnsiballZ_file.py'
Nov 24 01:42:38 compute-0 sudo[84149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:38 compute-0 python3.9[84151]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:38 compute-0 sudo[84149]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:38 compute-0 sudo[84301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnsuqvwuiqsovaracmemxljdgmumikhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948558.6820567-430-261030528126167/AnsiballZ_stat.py'
Nov 24 01:42:38 compute-0 sudo[84301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:39 compute-0 python3.9[84303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:39 compute-0 sudo[84301]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:39 compute-0 sudo[84424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqicckkmzearnpmispeuamwjzcnjdvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948558.6820567-430-261030528126167/AnsiballZ_copy.py'
Nov 24 01:42:39 compute-0 sudo[84424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:39 compute-0 python3.9[84426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948558.6820567-430-261030528126167/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:39 compute-0 sudo[84424]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:40 compute-0 sudo[84576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozdbbsowchzvgwlltlvyzfomuwdefrss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948559.9646707-446-12719230096753/AnsiballZ_file.py'
Nov 24 01:42:40 compute-0 sudo[84576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:40 compute-0 python3.9[84578]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:40 compute-0 sudo[84576]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:40 compute-0 sudo[84728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-digzlzvxvzcdrknilkyauleujolqeruy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948560.6889753-454-111090372655776/AnsiballZ_stat.py'
Nov 24 01:42:40 compute-0 sudo[84728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:41 compute-0 python3.9[84730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:42:41 compute-0 sudo[84728]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:41 compute-0 sudo[84851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvmtvkkcncsuulrtcmgmdmospyklifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948560.6889753-454-111090372655776/AnsiballZ_copy.py'
Nov 24 01:42:41 compute-0 sudo[84851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:41 compute-0 python3.9[84853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948560.6889753-454-111090372655776/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ab4114b2b61c095ea285e300962d5e84ecdc38f6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:41 compute-0 sudo[84851]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:42 compute-0 sshd-session[77193]: Connection closed by 192.168.122.30 port 41584
Nov 24 01:42:42 compute-0 sshd-session[77190]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:42:42 compute-0 systemd-logind[791]: Session 18 logged out. Waiting for processes to exit.
Nov 24 01:42:42 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 24 01:42:42 compute-0 systemd[1]: session-18.scope: Consumed 30.470s CPU time.
Nov 24 01:42:42 compute-0 systemd-logind[791]: Removed session 18.
Nov 24 01:42:47 compute-0 sshd-session[84879]: Accepted publickey for zuul from 192.168.122.30 port 60992 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:42:47 compute-0 systemd-logind[791]: New session 19 of user zuul.
Nov 24 01:42:47 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 24 01:42:47 compute-0 sshd-session[84879]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:42:48 compute-0 python3.9[85032]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:42:49 compute-0 sudo[85186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eekhkqtimsqtofsnsrldtcsxqthwdeyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948569.3135495-34-209452406923261/AnsiballZ_file.py'
Nov 24 01:42:49 compute-0 sudo[85186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:50 compute-0 python3.9[85188]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:50 compute-0 sudo[85186]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:50 compute-0 sudo[85338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzksqrhjewzolrluuunykbscfzpopdum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948570.2337718-34-256589961162764/AnsiballZ_file.py'
Nov 24 01:42:50 compute-0 sudo[85338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:50 compute-0 python3.9[85340]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:42:50 compute-0 sudo[85338]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:51 compute-0 python3.9[85490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:42:52 compute-0 sudo[85642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oylnqsasmdhtymxveyaorsovlwbmpgdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948571.6478086-57-48339789506771/AnsiballZ_seboolean.py'
Nov 24 01:42:52 compute-0 sudo[85642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:52 compute-0 python3.9[85644]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 01:42:52 compute-0 sshd-session[85567]: Received disconnect from 46.188.119.26 port 60390:11: Bye Bye [preauth]
Nov 24 01:42:52 compute-0 sshd-session[85567]: Disconnected from authenticating user root 46.188.119.26 port 60390 [preauth]
Nov 24 01:42:53 compute-0 sudo[85642]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:53 compute-0 sudo[85798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryzjotvudxazizcbllxkqoewmzwccqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948573.6846986-67-252354246343288/AnsiballZ_setup.py'
Nov 24 01:42:53 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 24 01:42:53 compute-0 sudo[85798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:54 compute-0 python3.9[85800]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:42:54 compute-0 sudo[85798]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:54 compute-0 sudo[85882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyhpnasftzotirbikebnxyutjpnplwxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948573.6846986-67-252354246343288/AnsiballZ_dnf.py'
Nov 24 01:42:54 compute-0 sudo[85882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:55 compute-0 python3.9[85884]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:42:56 compute-0 sudo[85882]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:57 compute-0 sudo[86035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbhucqiozddtpcdkdyuspgbrgrfnvsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948576.8863554-79-260124855681698/AnsiballZ_systemd.py'
Nov 24 01:42:57 compute-0 sudo[86035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:57 compute-0 python3.9[86037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:42:57 compute-0 sudo[86035]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:58 compute-0 sudo[86190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgezlmworkwztdkqylfvrhhireuthuib ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948578.1342876-87-229699035305071/AnsiballZ_edpm_nftables_snippet.py'
Nov 24 01:42:58 compute-0 sudo[86190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:58 compute-0 python3[86192]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 24 01:42:58 compute-0 sudo[86190]: pam_unix(sudo:session): session closed for user root
Nov 24 01:42:59 compute-0 sudo[86342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkaqkgokwfjyimztxbfcmloihafjtqia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948579.0423956-96-196642691345477/AnsiballZ_file.py'
Nov 24 01:42:59 compute-0 sudo[86342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:42:59 compute-0 python3.9[86344]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:42:59 compute-0 sudo[86342]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:00 compute-0 sudo[86494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmhfhabckuhskdsfigakcjywjpayenbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948579.7210555-104-155401855356382/AnsiballZ_stat.py'
Nov 24 01:43:00 compute-0 sudo[86494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:00 compute-0 python3.9[86496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:00 compute-0 sudo[86494]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:00 compute-0 sudo[86572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hutslfkmovlgpecoutxvfmdbsheybpqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948579.7210555-104-155401855356382/AnsiballZ_file.py'
Nov 24 01:43:00 compute-0 sudo[86572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:00 compute-0 python3.9[86574]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:00 compute-0 sudo[86572]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:01 compute-0 sudo[86724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzjrxagcpxiwxfmqrpczaaewpxtcxayg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948581.0983632-116-112068873035404/AnsiballZ_stat.py'
Nov 24 01:43:01 compute-0 sudo[86724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:01 compute-0 python3.9[86726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:01 compute-0 sudo[86724]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:01 compute-0 sudo[86802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijvwymadhlpticgexogtamotgpxzduvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948581.0983632-116-112068873035404/AnsiballZ_file.py'
Nov 24 01:43:01 compute-0 sudo[86802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:02 compute-0 python3.9[86804]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.euxztvp8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:02 compute-0 sudo[86802]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:02 compute-0 sudo[86954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfasbtgfkojykgflyjueosemhsndqrip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948582.2675471-128-24381291191488/AnsiballZ_stat.py'
Nov 24 01:43:02 compute-0 sudo[86954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:02 compute-0 python3.9[86956]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:02 compute-0 sudo[86954]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:03 compute-0 sudo[87032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trnmshohlbcwbvytfbpqaqixihmjycqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948582.2675471-128-24381291191488/AnsiballZ_file.py'
Nov 24 01:43:03 compute-0 sudo[87032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:03 compute-0 python3.9[87034]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:03 compute-0 sudo[87032]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:04 compute-0 sudo[87184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebjmwrpbxndmiayojwlxkaknxryazvgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948583.5825086-141-69381378218717/AnsiballZ_command.py'
Nov 24 01:43:04 compute-0 sudo[87184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:04 compute-0 python3.9[87186]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:04 compute-0 sudo[87184]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:04 compute-0 sudo[87337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuyfdgloorachvzkaijntdymqvnpcpqe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948584.4824831-149-257348553742416/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 01:43:04 compute-0 sudo[87337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:05 compute-0 python3[87339]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 01:43:05 compute-0 sudo[87337]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:05 compute-0 sudo[87489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltwyqpoyayyfnujwvwagilcitxbzxlni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948585.3917184-157-268177151143150/AnsiballZ_stat.py'
Nov 24 01:43:05 compute-0 sudo[87489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:05 compute-0 python3.9[87491]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:05 compute-0 sudo[87489]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:06 compute-0 sudo[87614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axjkyrvpmnwpldcfgcqcbyelwhggamkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948585.3917184-157-268177151143150/AnsiballZ_copy.py'
Nov 24 01:43:06 compute-0 sudo[87614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:06 compute-0 python3.9[87616]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948585.3917184-157-268177151143150/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:06 compute-0 sudo[87614]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:07 compute-0 sudo[87766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdrqciaozwccmhnnjryuwryyrkqnhtde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948586.983008-172-279714125011063/AnsiballZ_stat.py'
Nov 24 01:43:07 compute-0 sudo[87766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:07 compute-0 python3.9[87768]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:07 compute-0 sudo[87766]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:07 compute-0 sudo[87891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdsdxintkksdazruhotcnbzwrisvqoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948586.983008-172-279714125011063/AnsiballZ_copy.py'
Nov 24 01:43:07 compute-0 sudo[87891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:08 compute-0 python3.9[87893]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948586.983008-172-279714125011063/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:08 compute-0 sudo[87891]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:08 compute-0 sudo[88043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydkocilvhitbyirinmlopyytlqzcamyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948588.30471-187-45139272884966/AnsiballZ_stat.py'
Nov 24 01:43:08 compute-0 sudo[88043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:08 compute-0 python3.9[88045]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:08 compute-0 sudo[88043]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:09 compute-0 sudo[88168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bppdswweggecokdrfkcywehlmgxernwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948588.30471-187-45139272884966/AnsiballZ_copy.py'
Nov 24 01:43:09 compute-0 sudo[88168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:09 compute-0 python3.9[88170]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948588.30471-187-45139272884966/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:09 compute-0 sudo[88168]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:10 compute-0 sudo[88320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljmupsldpbrdmdeosuwnuiswgnlczqcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948589.713028-202-250784529458039/AnsiballZ_stat.py'
Nov 24 01:43:10 compute-0 sudo[88320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:10 compute-0 python3.9[88322]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:10 compute-0 sudo[88320]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:10 compute-0 sudo[88445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iudlcyyosjlwrnrbmzdskncywvipigpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948589.713028-202-250784529458039/AnsiballZ_copy.py'
Nov 24 01:43:10 compute-0 sudo[88445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:11 compute-0 python3.9[88447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948589.713028-202-250784529458039/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:11 compute-0 sudo[88445]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:11 compute-0 sudo[88597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbvzanrxnrddcqzjkjwxpyvvzaettmcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948591.2275028-217-118691563067744/AnsiballZ_stat.py'
Nov 24 01:43:11 compute-0 sudo[88597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:11 compute-0 python3.9[88599]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:11 compute-0 sudo[88597]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:12 compute-0 sudo[88722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egeywvnaldhuteyqkpuneiuqnoitiykp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948591.2275028-217-118691563067744/AnsiballZ_copy.py'
Nov 24 01:43:12 compute-0 sudo[88722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:12 compute-0 python3.9[88724]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948591.2275028-217-118691563067744/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:12 compute-0 sudo[88722]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:13 compute-0 sudo[88874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlruuwdnqrqwbaxeuodjkdodkvcacupt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948592.8247883-232-189304587487570/AnsiballZ_file.py'
Nov 24 01:43:13 compute-0 sudo[88874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:13 compute-0 python3.9[88876]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:13 compute-0 sudo[88874]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:13 compute-0 sudo[89026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvpcaycbmrrmvzytnraydrzcyfphntzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948593.5258896-240-79512575476214/AnsiballZ_command.py'
Nov 24 01:43:13 compute-0 sudo[89026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:14 compute-0 python3.9[89028]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:14 compute-0 sudo[89026]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:14 compute-0 sudo[89181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myapzdkjivkohpvezkjrspgrwnugkzip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948594.2367427-248-235543461760626/AnsiballZ_blockinfile.py'
Nov 24 01:43:14 compute-0 sudo[89181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:14 compute-0 python3.9[89183]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:14 compute-0 sudo[89181]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:15 compute-0 sudo[89333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpazxuwyfaubaweygqkrlzslvvhnrsuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948595.1017141-257-277666564339320/AnsiballZ_command.py'
Nov 24 01:43:15 compute-0 sudo[89333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:15 compute-0 python3.9[89335]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:15 compute-0 sudo[89333]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:16 compute-0 sudo[89486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfpozxarncjbccjnedmmdftfacodkqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948595.886635-265-65900951449329/AnsiballZ_stat.py'
Nov 24 01:43:16 compute-0 sudo[89486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:16 compute-0 python3.9[89488]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:43:16 compute-0 sudo[89486]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:16 compute-0 sudo[89640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcmnewxcduokwuqsozqulgnnsmvzsdpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948596.5264883-273-141574846290994/AnsiballZ_command.py'
Nov 24 01:43:16 compute-0 sudo[89640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:16 compute-0 python3.9[89642]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:17 compute-0 sudo[89640]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:17 compute-0 sudo[89795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyrqidyqrmjrvklhbzhffalobfdtawek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948597.1838825-281-278362910987005/AnsiballZ_file.py'
Nov 24 01:43:17 compute-0 sudo[89795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:17 compute-0 python3.9[89797]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:17 compute-0 sudo[89795]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:18 compute-0 python3.9[89947]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:43:19 compute-0 sudo[90098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrufautlwghelzvxulhxnigkysebtpdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948599.32946-321-200185484123622/AnsiballZ_command.py'
Nov 24 01:43:19 compute-0 sudo[90098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:19 compute-0 python3.9[90100]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:19 compute-0 ovs-vsctl[90101]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 24 01:43:19 compute-0 sudo[90098]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:20 compute-0 sudo[90251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svincgbrvioebvvbckezqyohezbxjsxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948599.9990528-330-45514176867824/AnsiballZ_command.py'
Nov 24 01:43:20 compute-0 sudo[90251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:20 compute-0 python3.9[90253]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:20 compute-0 sudo[90251]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:21 compute-0 sudo[90406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccldwhjhwpgvdjftrvercaepzggfhpqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948601.35533-338-143181645326330/AnsiballZ_command.py'
Nov 24 01:43:21 compute-0 sudo[90406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:21 compute-0 python3.9[90408]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:21 compute-0 ovs-vsctl[90409]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 24 01:43:21 compute-0 sudo[90406]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:22 compute-0 python3.9[90559]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:43:23 compute-0 sudo[90711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ischlzyztskjbvqvtubfgldycpvbphjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948603.3587773-355-98170783592795/AnsiballZ_file.py'
Nov 24 01:43:23 compute-0 sudo[90711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:23 compute-0 python3.9[90713]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:23 compute-0 sudo[90711]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:24 compute-0 sudo[90863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvvhysvanexkoxwynocmputyrxfxmwqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948604.0169685-363-139847827054619/AnsiballZ_stat.py'
Nov 24 01:43:24 compute-0 sudo[90863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:24 compute-0 python3.9[90865]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:24 compute-0 sudo[90863]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:24 compute-0 sudo[90941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbbcntriwrazvdsuajmlaondbdmiimoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948604.0169685-363-139847827054619/AnsiballZ_file.py'
Nov 24 01:43:24 compute-0 sudo[90941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:25 compute-0 python3.9[90943]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:25 compute-0 sudo[90941]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:25 compute-0 sudo[91093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvvpowrgmhqrqrniuladsioduvscpbst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948605.2274134-363-79675302328063/AnsiballZ_stat.py'
Nov 24 01:43:25 compute-0 sudo[91093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:25 compute-0 python3.9[91095]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:25 compute-0 sudo[91093]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:25 compute-0 sudo[91171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhhdoxtmbmyeijfgwjbutqirsrnrgcmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948605.2274134-363-79675302328063/AnsiballZ_file.py'
Nov 24 01:43:25 compute-0 sudo[91171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:26 compute-0 python3.9[91173]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:26 compute-0 sudo[91171]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:26 compute-0 sudo[91323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sajqdqzeariiqfxqwafkfpkeaotmikaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948606.3904407-386-199788616469418/AnsiballZ_file.py'
Nov 24 01:43:26 compute-0 sudo[91323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:26 compute-0 python3.9[91325]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:26 compute-0 sudo[91323]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:27 compute-0 sudo[91475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwfsxjwcilvymhftogatttxtevtdxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948607.051115-394-46388567210724/AnsiballZ_stat.py'
Nov 24 01:43:27 compute-0 sudo[91475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:27 compute-0 python3.9[91477]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:27 compute-0 sudo[91475]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:27 compute-0 sudo[91553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccvxvindylelnihihymywburwrpziyvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948607.051115-394-46388567210724/AnsiballZ_file.py'
Nov 24 01:43:27 compute-0 sudo[91553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:28 compute-0 python3.9[91555]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:28 compute-0 sudo[91553]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:28 compute-0 sudo[91705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqkocsduryetybkolktkuviqfajrnapq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948608.3113337-406-159039551759030/AnsiballZ_stat.py'
Nov 24 01:43:28 compute-0 sudo[91705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:28 compute-0 python3.9[91707]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:28 compute-0 sudo[91705]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:28 compute-0 sudo[91783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynnofhymodnikfllpbjkpzsvrknmfdhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948608.3113337-406-159039551759030/AnsiballZ_file.py'
Nov 24 01:43:28 compute-0 sudo[91783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:29 compute-0 python3.9[91785]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:29 compute-0 sudo[91783]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:29 compute-0 sudo[91935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekvwdsdntmtryayezjjvtlwyiusfjamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948609.3930273-418-152933866710485/AnsiballZ_systemd.py'
Nov 24 01:43:29 compute-0 sudo[91935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:30 compute-0 python3.9[91937]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:43:30 compute-0 systemd[1]: Reloading.
Nov 24 01:43:30 compute-0 systemd-sysv-generator[91967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:43:30 compute-0 systemd-rc-local-generator[91964]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:43:30 compute-0 sudo[91935]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:30 compute-0 sudo[92124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sshmkbwaadpkmsynfzrnlazzugqhbkxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948610.6143987-426-198566699969695/AnsiballZ_stat.py'
Nov 24 01:43:30 compute-0 sudo[92124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:31 compute-0 python3.9[92126]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:31 compute-0 sudo[92124]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:31 compute-0 sudo[92202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmriwqvpmwpjquartlluklsyvrjfsxbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948610.6143987-426-198566699969695/AnsiballZ_file.py'
Nov 24 01:43:31 compute-0 sudo[92202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:31 compute-0 python3.9[92204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:31 compute-0 sudo[92202]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:32 compute-0 sudo[92354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwntpzuqvwqxhioqvjqnzxvfnwuuszbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948611.7617762-438-154443778363834/AnsiballZ_stat.py'
Nov 24 01:43:32 compute-0 sudo[92354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:32 compute-0 python3.9[92356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:32 compute-0 sudo[92354]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:32 compute-0 sudo[92432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eletjlydxckbcrzwooaffdhjjedvmale ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948611.7617762-438-154443778363834/AnsiballZ_file.py'
Nov 24 01:43:32 compute-0 sudo[92432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:32 compute-0 python3.9[92434]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:32 compute-0 sudo[92432]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:33 compute-0 sudo[92584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmletmiglcvigknpgwnrqoijqkjjqiii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948613.0336788-450-117480669882894/AnsiballZ_systemd.py'
Nov 24 01:43:33 compute-0 sudo[92584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:33 compute-0 python3.9[92586]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:43:33 compute-0 systemd[1]: Reloading.
Nov 24 01:43:33 compute-0 systemd-rc-local-generator[92612]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:43:33 compute-0 systemd-sysv-generator[92616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:43:33 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 01:43:33 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 01:43:33 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 01:43:33 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 01:43:34 compute-0 sudo[92584]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:34 compute-0 sudo[92777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuxuhxtqrryshhgxeryttyqcagtgfmig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948614.3060765-460-215684723721360/AnsiballZ_file.py'
Nov 24 01:43:34 compute-0 sudo[92777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:34 compute-0 python3.9[92779]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:35 compute-0 sudo[92777]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:35 compute-0 sudo[92929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zurjaolzpwlcctcfcponfczrcyurmkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948615.1953993-468-28724725268718/AnsiballZ_stat.py'
Nov 24 01:43:35 compute-0 sudo[92929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:35 compute-0 python3.9[92931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:35 compute-0 sudo[92929]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:36 compute-0 sudo[93052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cocvnhxtmlbbgsybvnkfpdhqwfodmczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948615.1953993-468-28724725268718/AnsiballZ_copy.py'
Nov 24 01:43:36 compute-0 sudo[93052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:36 compute-0 python3.9[93054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948615.1953993-468-28724725268718/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:36 compute-0 sudo[93052]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:37 compute-0 sudo[93204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lughgchesdejhrikbjhqknmoevhagors ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948616.6546714-485-248928305377764/AnsiballZ_file.py'
Nov 24 01:43:37 compute-0 sudo[93204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:37 compute-0 python3.9[93206]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:43:37 compute-0 sudo[93204]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:37 compute-0 sudo[93356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrswjalzmdlzxzaapgwhobhixsuwzexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948617.4587827-493-109043735858528/AnsiballZ_stat.py'
Nov 24 01:43:37 compute-0 sudo[93356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:37 compute-0 python3.9[93358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:43:37 compute-0 sudo[93356]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:38 compute-0 sudo[93479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxydmfccwttraxtsfstorlwwpoonbbpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948617.4587827-493-109043735858528/AnsiballZ_copy.py'
Nov 24 01:43:38 compute-0 sudo[93479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:38 compute-0 python3.9[93481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948617.4587827-493-109043735858528/.source.json _original_basename=.4hedtkrh follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:38 compute-0 sudo[93479]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:39 compute-0 sudo[93631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmarijspfdizyucjbxlvvmhaehqhyljt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948618.9454677-508-19355982412385/AnsiballZ_file.py'
Nov 24 01:43:39 compute-0 sudo[93631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:39 compute-0 python3.9[93633]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:39 compute-0 sudo[93631]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:39 compute-0 sudo[93783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxmlhjajqypblconpyaiwtpurabssdrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948619.6308234-516-143672333896995/AnsiballZ_stat.py'
Nov 24 01:43:39 compute-0 sudo[93783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:40 compute-0 sudo[93783]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:40 compute-0 sudo[93906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qugujojadvmkhxzehhcdqucofyhgnrqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948619.6308234-516-143672333896995/AnsiballZ_copy.py'
Nov 24 01:43:40 compute-0 sudo[93906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:40 compute-0 sudo[93906]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:41 compute-0 sudo[94058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjhfrfprhhuyglntroowumlnwfzupiqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948621.0566626-533-71528836188688/AnsiballZ_container_config_data.py'
Nov 24 01:43:41 compute-0 sudo[94058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:41 compute-0 python3.9[94060]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 24 01:43:41 compute-0 sudo[94058]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:42 compute-0 sudo[94210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wexdjmpdhvivnpjuppfmfnnqyvsrgmdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948621.9749777-542-270246681902005/AnsiballZ_container_config_hash.py'
Nov 24 01:43:42 compute-0 sudo[94210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:42 compute-0 python3.9[94212]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:43:42 compute-0 sudo[94210]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:43 compute-0 sudo[94362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjyagehsiysdymkgfrgpwlkfuokuzgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948622.8815818-551-159908768054382/AnsiballZ_podman_container_info.py'
Nov 24 01:43:43 compute-0 sudo[94362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:43 compute-0 python3.9[94364]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 01:43:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:43:43 compute-0 sudo[94362]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:44 compute-0 sudo[94526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heacnlalhngkfzluhjbdgsejqhwtdvbg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948624.142373-564-120534605214350/AnsiballZ_edpm_container_manage.py'
Nov 24 01:43:44 compute-0 sudo[94526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:44 compute-0 python3[94528]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:43:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:43:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:43:45 compute-0 podman[94564]: 2025-11-24 01:43:45.126539868 +0000 UTC m=+0.032268564 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 01:43:45 compute-0 podman[94564]: 2025-11-24 01:43:45.300990923 +0000 UTC m=+0.206719599 container create c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 01:43:45 compute-0 python3[94528]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 01:43:45 compute-0 sudo[94526]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:45 compute-0 sudo[94752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bocgshamdyxukcfasuyecjbuecjvqvjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948625.6261823-572-141323542730284/AnsiballZ_stat.py'
Nov 24 01:43:45 compute-0 sudo[94752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 01:43:46 compute-0 python3.9[94754]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:43:46 compute-0 sudo[94752]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:46 compute-0 sudo[94906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmpbamlnebqimnresvmvejytwucdcxhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948626.3241317-581-31252332267031/AnsiballZ_file.py'
Nov 24 01:43:46 compute-0 sudo[94906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:46 compute-0 python3.9[94908]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:46 compute-0 sudo[94906]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:47 compute-0 sudo[94982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crzvpjetueanlfmetcimtpplgwhqajaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948626.3241317-581-31252332267031/AnsiballZ_stat.py'
Nov 24 01:43:47 compute-0 sudo[94982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:47 compute-0 python3.9[94984]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:43:47 compute-0 sudo[94982]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:47 compute-0 sudo[95133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsvkssiwfyvyxojkcqaofwzzsywnsvpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948627.2717783-581-195567453334888/AnsiballZ_copy.py'
Nov 24 01:43:47 compute-0 sudo[95133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:47 compute-0 python3.9[95135]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763948627.2717783-581-195567453334888/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:43:47 compute-0 sudo[95133]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:48 compute-0 sudo[95209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxpdnlymueqzptqhqjgqmzzrjgzhjtvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948627.2717783-581-195567453334888/AnsiballZ_systemd.py'
Nov 24 01:43:48 compute-0 sudo[95209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:48 compute-0 python3.9[95211]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:43:48 compute-0 systemd[1]: Reloading.
Nov 24 01:43:48 compute-0 systemd-sysv-generator[95241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:43:48 compute-0 systemd-rc-local-generator[95237]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:43:48 compute-0 sudo[95209]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:49 compute-0 sudo[95320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcndydvdnnoyrgfjjqdghsrlyygthikw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948627.2717783-581-195567453334888/AnsiballZ_systemd.py'
Nov 24 01:43:49 compute-0 sudo[95320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:49 compute-0 python3.9[95322]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:43:49 compute-0 systemd[1]: Reloading.
Nov 24 01:43:49 compute-0 systemd-sysv-generator[95356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:43:49 compute-0 systemd-rc-local-generator[95353]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:43:49 compute-0 systemd[1]: Starting ovn_controller container...
Nov 24 01:43:49 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 24 01:43:49 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:43:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cb490c296a917986b66dfef57b597e90eb6c75eef6ee76c597ff8a3926b97b4/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 01:43:49 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4.
Nov 24 01:43:49 compute-0 podman[95364]: 2025-11-24 01:43:49.913430619 +0000 UTC m=+0.182880780 container init c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 24 01:43:49 compute-0 ovn_controller[95380]: + sudo -E kolla_set_configs
Nov 24 01:43:49 compute-0 podman[95364]: 2025-11-24 01:43:49.947929363 +0000 UTC m=+0.217379464 container start c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 01:43:49 compute-0 edpm-start-podman-container[95364]: ovn_controller
Nov 24 01:43:49 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 24 01:43:50 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 24 01:43:50 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 24 01:43:50 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 24 01:43:50 compute-0 systemd[95416]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 24 01:43:50 compute-0 edpm-start-podman-container[95363]: Creating additional drop-in dependency for "ovn_controller" (c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4)
Nov 24 01:43:50 compute-0 podman[95386]: 2025-11-24 01:43:50.069945374 +0000 UTC m=+0.103180026 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 24 01:43:50 compute-0 systemd[1]: c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4-1850ba4d1246933.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:43:50 compute-0 systemd[1]: c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4-1850ba4d1246933.service: Failed with result 'exit-code'.
Nov 24 01:43:50 compute-0 systemd[1]: Reloading.
Nov 24 01:43:50 compute-0 systemd-rc-local-generator[95467]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:43:50 compute-0 systemd-sysv-generator[95470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:43:50 compute-0 systemd[95416]: Queued start job for default target Main User Target.
Nov 24 01:43:50 compute-0 systemd[95416]: Created slice User Application Slice.
Nov 24 01:43:50 compute-0 systemd[95416]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 24 01:43:50 compute-0 systemd[95416]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 01:43:50 compute-0 systemd[95416]: Reached target Paths.
Nov 24 01:43:50 compute-0 systemd[95416]: Reached target Timers.
Nov 24 01:43:50 compute-0 systemd[95416]: Starting D-Bus User Message Bus Socket...
Nov 24 01:43:50 compute-0 systemd[95416]: Starting Create User's Volatile Files and Directories...
Nov 24 01:43:50 compute-0 systemd[95416]: Listening on D-Bus User Message Bus Socket.
Nov 24 01:43:50 compute-0 systemd[95416]: Finished Create User's Volatile Files and Directories.
Nov 24 01:43:50 compute-0 systemd[95416]: Reached target Sockets.
Nov 24 01:43:50 compute-0 systemd[95416]: Reached target Basic System.
Nov 24 01:43:50 compute-0 systemd[95416]: Reached target Main User Target.
Nov 24 01:43:50 compute-0 systemd[95416]: Startup finished in 156ms.
Nov 24 01:43:50 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 24 01:43:50 compute-0 systemd[1]: Started ovn_controller container.
Nov 24 01:43:50 compute-0 systemd[1]: Started Session c1 of User root.
Nov 24 01:43:50 compute-0 sudo[95320]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:50 compute-0 ovn_controller[95380]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:43:50 compute-0 ovn_controller[95380]: INFO:__main__:Validating config file
Nov 24 01:43:50 compute-0 ovn_controller[95380]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:43:50 compute-0 ovn_controller[95380]: INFO:__main__:Writing out command to execute
Nov 24 01:43:50 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: ++ cat /run_command
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + ARGS=
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + sudo kolla_copy_cacerts
Nov 24 01:43:50 compute-0 systemd[1]: Started Session c2 of User root.
Nov 24 01:43:50 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + [[ ! -n '' ]]
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + . kolla_extend_start
Nov 24 01:43:50 compute-0 ovn_controller[95380]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + umask 0022
Nov 24 01:43:50 compute-0 ovn_controller[95380]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5182] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5188] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5197] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5202] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5204] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 01:43:50 compute-0 kernel: br-int: entered promiscuous mode
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 01:43:50 compute-0 ovn_controller[95380]: 2025-11-24T01:43:50Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5398] manager: (ovn-fbcb26-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 24 01:43:50 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5581] device (genev_sys_6081): carrier: link connected
Nov 24 01:43:50 compute-0 systemd-udevd[95540]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:43:50 compute-0 NetworkManager[55458]: <info>  [1763948630.5587] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 24 01:43:50 compute-0 systemd-udevd[95542]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:43:50 compute-0 sudo[95644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vugtwljutdduyaefyjcssvojsfzxzthq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948630.543007-609-219197555939703/AnsiballZ_command.py'
Nov 24 01:43:50 compute-0 sudo[95644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:50 compute-0 python3.9[95646]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:50 compute-0 ovs-vsctl[95647]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 24 01:43:51 compute-0 sudo[95644]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:51 compute-0 sudo[95797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zisemwhattdykdmvbtvhdxkdtjvebxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948631.1950855-617-235964464811175/AnsiballZ_command.py'
Nov 24 01:43:51 compute-0 sudo[95797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:51 compute-0 python3.9[95799]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:51 compute-0 ovs-vsctl[95801]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 24 01:43:51 compute-0 sudo[95797]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:52 compute-0 sudo[95952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhbhoqxnlmbzrllntcnwjzhslgsjbxtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948632.093436-631-45334202168175/AnsiballZ_command.py'
Nov 24 01:43:52 compute-0 sudo[95952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:43:52 compute-0 python3.9[95954]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:43:52 compute-0 ovs-vsctl[95955]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 24 01:43:52 compute-0 sudo[95952]: pam_unix(sudo:session): session closed for user root
Nov 24 01:43:53 compute-0 sshd-session[84882]: Connection closed by 192.168.122.30 port 60992
Nov 24 01:43:53 compute-0 sshd-session[84879]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:43:53 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 24 01:43:53 compute-0 systemd[1]: session-19.scope: Consumed 48.404s CPU time.
Nov 24 01:43:53 compute-0 systemd-logind[791]: Session 19 logged out. Waiting for processes to exit.
Nov 24 01:43:53 compute-0 systemd-logind[791]: Removed session 19.
Nov 24 01:43:58 compute-0 sshd-session[95980]: Accepted publickey for zuul from 192.168.122.30 port 60260 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:43:58 compute-0 systemd-logind[791]: New session 21 of user zuul.
Nov 24 01:43:58 compute-0 systemd[1]: Started Session 21 of User zuul.
Nov 24 01:43:58 compute-0 sshd-session[95980]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:43:59 compute-0 python3.9[96133]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:44:00 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 24 01:44:00 compute-0 systemd[95416]: Activating special unit Exit the Session...
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped target Main User Target.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped target Basic System.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped target Paths.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped target Sockets.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped target Timers.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 01:44:00 compute-0 systemd[95416]: Closed D-Bus User Message Bus Socket.
Nov 24 01:44:00 compute-0 systemd[95416]: Stopped Create User's Volatile Files and Directories.
Nov 24 01:44:00 compute-0 systemd[95416]: Removed slice User Application Slice.
Nov 24 01:44:00 compute-0 systemd[95416]: Reached target Shutdown.
Nov 24 01:44:00 compute-0 systemd[95416]: Finished Exit the Session.
Nov 24 01:44:00 compute-0 systemd[95416]: Reached target Exit the Session.
Nov 24 01:44:00 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 24 01:44:00 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 24 01:44:00 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 24 01:44:00 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 24 01:44:00 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 24 01:44:00 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 24 01:44:00 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 24 01:44:00 compute-0 sudo[96289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sigqndgwcxkochzfnguktgzbopyotjad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948640.2013855-34-83994301390922/AnsiballZ_file.py'
Nov 24 01:44:00 compute-0 sudo[96289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:00 compute-0 python3.9[96291]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:00 compute-0 sudo[96289]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:01 compute-0 sudo[96441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-detltsicjilflkuxooebluaikvddyesw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948641.055888-34-82989583728260/AnsiballZ_file.py'
Nov 24 01:44:01 compute-0 sudo[96441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:01 compute-0 python3.9[96443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:01 compute-0 sudo[96441]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:02 compute-0 sudo[96593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjvxcndwnsdhdfwzgtsjwpsbabyxaeaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948641.7376661-34-167618599019903/AnsiballZ_file.py'
Nov 24 01:44:02 compute-0 sudo[96593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:02 compute-0 python3.9[96595]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:02 compute-0 sudo[96593]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:02 compute-0 sudo[96745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqwgfbvcdnexhpmfrhfgqsccgueivci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948642.381591-34-212567988337894/AnsiballZ_file.py'
Nov 24 01:44:02 compute-0 sudo[96745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:02 compute-0 python3.9[96747]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:02 compute-0 sudo[96745]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:03 compute-0 sudo[96897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-denpbmrmfeeaygnvxuryantkqbjmhswe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948643.0908852-34-252909678650545/AnsiballZ_file.py'
Nov 24 01:44:03 compute-0 sudo[96897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:03 compute-0 python3.9[96899]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:03 compute-0 sudo[96897]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:04 compute-0 python3.9[97049]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:44:04 compute-0 sudo[97199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnwfauoqlcnvmsfiyodnyiglawtvfynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948644.3936522-78-172571258885874/AnsiballZ_seboolean.py'
Nov 24 01:44:04 compute-0 sudo[97199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:05 compute-0 python3.9[97201]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 01:44:05 compute-0 sudo[97199]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:06 compute-0 python3.9[97351]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:07 compute-0 python3.9[97472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948645.8381982-86-134358772644268/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:07 compute-0 python3.9[97624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:08 compute-0 sshd-session[97520]: Invalid user techuser from 46.188.119.26 port 60726
Nov 24 01:44:08 compute-0 sshd-session[97520]: Received disconnect from 46.188.119.26 port 60726:11: Bye Bye [preauth]
Nov 24 01:44:08 compute-0 sshd-session[97520]: Disconnected from invalid user techuser 46.188.119.26 port 60726 [preauth]
Nov 24 01:44:08 compute-0 python3.9[97745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948647.4901102-101-256371508470960/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:09 compute-0 sudo[97896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvyttnzvkfhevfowvpicynmhnamgnnak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948648.814064-118-189289287156182/AnsiballZ_setup.py'
Nov 24 01:44:09 compute-0 sudo[97896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:09 compute-0 python3.9[97898]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:44:09 compute-0 sudo[97896]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:10 compute-0 sudo[97980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozwxlgdvukopuafptkzmxpginishwfms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948648.814064-118-189289287156182/AnsiballZ_dnf.py'
Nov 24 01:44:10 compute-0 sudo[97980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:10 compute-0 python3.9[97982]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:44:11 compute-0 sudo[97980]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:12 compute-0 sudo[98133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grpvqdkcybgwvqiehxgzjsriaegqkvoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948651.9629645-130-12420228356633/AnsiballZ_systemd.py'
Nov 24 01:44:12 compute-0 sudo[98133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:12 compute-0 python3.9[98135]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:44:12 compute-0 sudo[98133]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:13 compute-0 python3.9[98288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:13 compute-0 python3.9[98409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948653.073868-138-181929728610586/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:14 compute-0 python3.9[98559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:15 compute-0 python3.9[98680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948654.1422281-138-219990467040537/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:16 compute-0 python3.9[98830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:17 compute-0 python3.9[98951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948656.0937028-182-74585516793294/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:17 compute-0 python3.9[99101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:18 compute-0 python3.9[99222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948657.2308183-182-154742702455693/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:19 compute-0 python3.9[99372]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:44:20 compute-0 sudo[99525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdceczxlnwoeadnsezrcyvdbkfhzhrid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948659.4090629-220-133645206971865/AnsiballZ_file.py'
Nov 24 01:44:20 compute-0 sudo[99525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:20 compute-0 ovn_controller[95380]: 2025-11-24T01:44:20Z|00025|memory|INFO|16384 kB peak resident set size after 30.4 seconds
Nov 24 01:44:20 compute-0 ovn_controller[95380]: 2025-11-24T01:44:20Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Nov 24 01:44:21 compute-0 podman[99514]: 2025-11-24 01:44:21.013569206 +0000 UTC m=+0.154417387 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 01:44:21 compute-0 python3.9[99532]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:21 compute-0 sudo[99525]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:21 compute-0 sudo[99699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoytyccufassoamzussopcqgukxzbdqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948661.233193-228-253197166104158/AnsiballZ_stat.py'
Nov 24 01:44:21 compute-0 sudo[99699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:21 compute-0 python3.9[99701]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:21 compute-0 sudo[99699]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:22 compute-0 sudo[99777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmbdaagtaexleeoioodaekgttspuyax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948661.233193-228-253197166104158/AnsiballZ_file.py'
Nov 24 01:44:22 compute-0 sudo[99777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:22 compute-0 python3.9[99779]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:22 compute-0 sudo[99777]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:22 compute-0 sudo[99929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oghcwlrcymepaltdalijdjqjzuttzbkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948662.4401536-228-239819908463692/AnsiballZ_stat.py'
Nov 24 01:44:22 compute-0 sudo[99929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:22 compute-0 python3.9[99931]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:23 compute-0 sudo[99929]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:23 compute-0 sudo[100007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnxisqfztpvzhjxogdumbgmhgxhgbkaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948662.4401536-228-239819908463692/AnsiballZ_file.py'
Nov 24 01:44:23 compute-0 sudo[100007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:23 compute-0 python3.9[100009]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:23 compute-0 sudo[100007]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:23 compute-0 sudo[100159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oorcjfneosgzoaxakegmptsqsdjizdoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948663.6435132-251-14240052117938/AnsiballZ_file.py'
Nov 24 01:44:23 compute-0 sudo[100159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:24 compute-0 python3.9[100161]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:24 compute-0 sudo[100159]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:24 compute-0 sudo[100311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajytojqpiytvwlbfbdmlzhicuafxhfwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948664.3531144-259-10102022152475/AnsiballZ_stat.py'
Nov 24 01:44:24 compute-0 sudo[100311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:24 compute-0 python3.9[100313]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:24 compute-0 sudo[100311]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:25 compute-0 sudo[100389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqziigdhxmrjknuvmavzhtsalcorvzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948664.3531144-259-10102022152475/AnsiballZ_file.py'
Nov 24 01:44:25 compute-0 sudo[100389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:25 compute-0 python3.9[100391]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:25 compute-0 sudo[100389]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:25 compute-0 sudo[100541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inowadvhyuewwreksspddunbzbocqlik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948665.5419772-271-110424017042184/AnsiballZ_stat.py'
Nov 24 01:44:25 compute-0 sudo[100541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:26 compute-0 python3.9[100543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:26 compute-0 sudo[100541]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:26 compute-0 sudo[100619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrwytqanatfdnagjuixrnmhqwdjdtnyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948665.5419772-271-110424017042184/AnsiballZ_file.py'
Nov 24 01:44:26 compute-0 sudo[100619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:26 compute-0 python3.9[100621]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:26 compute-0 sudo[100619]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:26 compute-0 sudo[100771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxzqyiwutikwhxtngwouturzbhowcmly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948666.6678398-283-183199881508972/AnsiballZ_systemd.py'
Nov 24 01:44:26 compute-0 sudo[100771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:27 compute-0 python3.9[100773]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:44:27 compute-0 systemd[1]: Reloading.
Nov 24 01:44:27 compute-0 systemd-sysv-generator[100801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:27 compute-0 systemd-rc-local-generator[100798]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:27 compute-0 sudo[100771]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:28 compute-0 sudo[100959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmevbcltkycqqhyqjqochvtlvufzmdsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948667.7315037-291-230184172958973/AnsiballZ_stat.py'
Nov 24 01:44:28 compute-0 sudo[100959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:28 compute-0 python3.9[100961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:28 compute-0 sudo[100959]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:28 compute-0 sudo[101037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbmtryryvxbjkltpayzvbxptpianrhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948667.7315037-291-230184172958973/AnsiballZ_file.py'
Nov 24 01:44:28 compute-0 sudo[101037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:28 compute-0 python3.9[101039]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:28 compute-0 sudo[101037]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:29 compute-0 sudo[101189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reibnbwkigswufjtzxamuavkqybdxeip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948668.883878-303-215477506773391/AnsiballZ_stat.py'
Nov 24 01:44:29 compute-0 sudo[101189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:29 compute-0 python3.9[101191]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:29 compute-0 sudo[101189]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:29 compute-0 sudo[101267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fypbmzngwuzyeuoqxoefvblkwmmyxuai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948668.883878-303-215477506773391/AnsiballZ_file.py'
Nov 24 01:44:29 compute-0 sudo[101267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:29 compute-0 python3.9[101269]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:29 compute-0 sudo[101267]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:30 compute-0 sudo[101419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uskezekstpytpnbzadusnhubazwodwpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948670.0519834-315-30115827938800/AnsiballZ_systemd.py'
Nov 24 01:44:30 compute-0 sudo[101419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:30 compute-0 python3.9[101421]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:44:30 compute-0 systemd[1]: Reloading.
Nov 24 01:44:30 compute-0 systemd-rc-local-generator[101452]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:30 compute-0 systemd-sysv-generator[101456]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:30 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 01:44:30 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 01:44:30 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 01:44:30 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 01:44:31 compute-0 sudo[101419]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:31 compute-0 sudo[101614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmdqadnzrvrrhbelhppduwawbqjkpytm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948671.272338-325-276922074947753/AnsiballZ_file.py'
Nov 24 01:44:31 compute-0 sudo[101614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:31 compute-0 python3.9[101616]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:31 compute-0 sudo[101614]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:32 compute-0 sudo[101766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axkomxvaxwynarracmmlclntsclwrbtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948672.0264492-333-222911245631005/AnsiballZ_stat.py'
Nov 24 01:44:32 compute-0 sudo[101766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:32 compute-0 python3.9[101768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:32 compute-0 sudo[101766]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:32 compute-0 sudo[101889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjuojmtavjdiuwaognqxrmrneyqbkvye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948672.0264492-333-222911245631005/AnsiballZ_copy.py'
Nov 24 01:44:32 compute-0 sudo[101889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:33 compute-0 python3.9[101891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763948672.0264492-333-222911245631005/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:33 compute-0 sudo[101889]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:33 compute-0 sudo[102041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogcabdzfmudobtbtvarkstyxhyjyhcng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948673.4079285-350-12592326412291/AnsiballZ_file.py'
Nov 24 01:44:33 compute-0 sudo[102041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:33 compute-0 python3.9[102043]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:44:33 compute-0 sudo[102041]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:34 compute-0 sudo[102193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhqxowdcmvfmyugrqmqffckbyirbpbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948674.1464665-358-194067449491248/AnsiballZ_stat.py'
Nov 24 01:44:34 compute-0 sudo[102193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:34 compute-0 python3.9[102195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:44:34 compute-0 sudo[102193]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:35 compute-0 sudo[102316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etcduphtnzafuogbskwlgcddnqhlvhcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948674.1464665-358-194067449491248/AnsiballZ_copy.py'
Nov 24 01:44:35 compute-0 sudo[102316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:35 compute-0 python3.9[102318]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948674.1464665-358-194067449491248/.source.json _original_basename=.8_azfdi_ follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:35 compute-0 sudo[102316]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:35 compute-0 sudo[102468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmxewfbgrekhndiyjtcczwuohaqagtkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948675.4984345-373-268905280721196/AnsiballZ_file.py'
Nov 24 01:44:35 compute-0 sudo[102468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:35 compute-0 python3.9[102470]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:36 compute-0 sudo[102468]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:36 compute-0 sudo[102620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrqokqkxugkrjvazyhmabrrbhrfzqfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948676.194596-381-51838023185486/AnsiballZ_stat.py'
Nov 24 01:44:36 compute-0 sudo[102620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:36 compute-0 sudo[102620]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:37 compute-0 sudo[102743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwmyeluinfzndctujllsaspccyidtpko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948676.194596-381-51838023185486/AnsiballZ_copy.py'
Nov 24 01:44:37 compute-0 sudo[102743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:37 compute-0 sudo[102743]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:38 compute-0 sudo[102895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvuauqoyairsnqgffjaymxsocsmrknbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948677.6953712-398-52902358875933/AnsiballZ_container_config_data.py'
Nov 24 01:44:38 compute-0 sudo[102895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:38 compute-0 python3.9[102897]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 24 01:44:38 compute-0 sudo[102895]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:39 compute-0 sudo[103047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkqxpgpluldsryqqitmfnmblbtvbwvoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948678.618177-407-260079891972999/AnsiballZ_container_config_hash.py'
Nov 24 01:44:39 compute-0 sudo[103047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:39 compute-0 python3.9[103049]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:44:39 compute-0 sudo[103047]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:40 compute-0 sudo[103199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mytbdhbdkoczvvnyvwjptjwbajriivsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948679.5925672-416-3962554162324/AnsiballZ_podman_container_info.py'
Nov 24 01:44:40 compute-0 sudo[103199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:40 compute-0 python3.9[103201]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 01:44:40 compute-0 sudo[103199]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:41 compute-0 sudo[103377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uekpzvrbbjmkqiymigajzmuedtryjsji ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948680.856347-429-11933621418955/AnsiballZ_edpm_container_manage.py'
Nov 24 01:44:41 compute-0 sudo[103377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:41 compute-0 python3[103379]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:44:41 compute-0 podman[103416]: 2025-11-24 01:44:41.869492706 +0000 UTC m=+0.057981608 container create ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:44:41 compute-0 podman[103416]: 2025-11-24 01:44:41.837847458 +0000 UTC m=+0.026336440 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:44:41 compute-0 python3[103379]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:44:42 compute-0 sudo[103377]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:42 compute-0 sudo[103603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdfvcvjfnxcjcapzkpyyfsiqqsfumdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948682.1928098-437-91385599558835/AnsiballZ_stat.py'
Nov 24 01:44:42 compute-0 sudo[103603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:42 compute-0 python3.9[103605]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:44:42 compute-0 sudo[103603]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:43 compute-0 sudo[103757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kicblhoijsihpvhtzqedhoapinnqctpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948682.962997-446-222048267915819/AnsiballZ_file.py'
Nov 24 01:44:43 compute-0 sudo[103757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:43 compute-0 python3.9[103759]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:43 compute-0 sudo[103757]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:43 compute-0 sudo[103833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgelyhykyetnrgpzncowsqxxvshgkarg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948682.962997-446-222048267915819/AnsiballZ_stat.py'
Nov 24 01:44:43 compute-0 sudo[103833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:43 compute-0 python3.9[103835]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:44:43 compute-0 sudo[103833]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:44 compute-0 sudo[103984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqlquqjgtdvvzioftjasicnyfrfcoqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948684.0075927-446-121941206011012/AnsiballZ_copy.py'
Nov 24 01:44:44 compute-0 sudo[103984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:44 compute-0 python3.9[103986]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763948684.0075927-446-121941206011012/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:44:44 compute-0 sudo[103984]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:44 compute-0 sudo[104060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtcdvkkhivecizdrbnpiybihvtjjnsms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948684.0075927-446-121941206011012/AnsiballZ_systemd.py'
Nov 24 01:44:44 compute-0 sudo[104060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:45 compute-0 python3.9[104062]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:44:45 compute-0 systemd[1]: Reloading.
Nov 24 01:44:45 compute-0 systemd-rc-local-generator[104088]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:45 compute-0 systemd-sysv-generator[104091]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:45 compute-0 sudo[104060]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:45 compute-0 sudo[104172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpmfemafzmrqsuiltxtnqvivlkorszmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948684.0075927-446-121941206011012/AnsiballZ_systemd.py'
Nov 24 01:44:45 compute-0 sudo[104172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:46 compute-0 python3.9[104174]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:44:46 compute-0 systemd[1]: Reloading.
Nov 24 01:44:46 compute-0 systemd-rc-local-generator[104206]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:46 compute-0 systemd-sysv-generator[104209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:46 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 24 01:44:46 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:44:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e8c4a88344434fdfc0d1d481cd5c65e2acc29b688448093a3db979015045977/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 24 01:44:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e8c4a88344434fdfc0d1d481cd5c65e2acc29b688448093a3db979015045977/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:44:46 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d.
Nov 24 01:44:46 compute-0 podman[104216]: 2025-11-24 01:44:46.509304643 +0000 UTC m=+0.139528142 container init ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + sudo -E kolla_set_configs
Nov 24 01:44:46 compute-0 podman[104216]: 2025-11-24 01:44:46.537916372 +0000 UTC m=+0.168139851 container start ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 01:44:46 compute-0 edpm-start-podman-container[104216]: ovn_metadata_agent
Nov 24 01:44:46 compute-0 edpm-start-podman-container[104215]: Creating additional drop-in dependency for "ovn_metadata_agent" (ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d)
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Validating config file
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Copying service configuration files
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Writing out command to execute
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: ++ cat /run_command
Nov 24 01:44:46 compute-0 systemd[1]: Reloading.
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + CMD=neutron-ovn-metadata-agent
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + ARGS=
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + sudo kolla_copy_cacerts
Nov 24 01:44:46 compute-0 podman[104240]: 2025-11-24 01:44:46.638658932 +0000 UTC m=+0.079580593 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + [[ ! -n '' ]]
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + . kolla_extend_start
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: Running command: 'neutron-ovn-metadata-agent'
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + umask 0022
Nov 24 01:44:46 compute-0 ovn_metadata_agent[104233]: + exec neutron-ovn-metadata-agent
Nov 24 01:44:46 compute-0 systemd-rc-local-generator[104300]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:46 compute-0 systemd-sysv-generator[104306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:46 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 24 01:44:46 compute-0 sudo[104172]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:47 compute-0 sshd-session[95983]: Connection closed by 192.168.122.30 port 60260
Nov 24 01:44:47 compute-0 sshd-session[95980]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:44:47 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Nov 24 01:44:47 compute-0 systemd[1]: session-21.scope: Consumed 37.489s CPU time.
Nov 24 01:44:47 compute-0 systemd-logind[791]: Session 21 logged out. Waiting for processes to exit.
Nov 24 01:44:47 compute-0 systemd-logind[791]: Removed session 21.
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.353 104238 INFO neutron.common.config [-] Logging enabled!
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.354 104238 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.354 104238 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.355 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.355 104238 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.355 104238 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.355 104238 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.356 104238 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.357 104238 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.358 104238 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.359 104238 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.360 104238 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.361 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.362 104238 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.363 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.364 104238 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.365 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.366 104238 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.367 104238 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.368 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.369 104238 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.370 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.371 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.372 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.373 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.374 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.375 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.376 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.377 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.378 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.379 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.380 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.381 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.382 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.383 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.384 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.385 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.386 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.387 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.388 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.389 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.390 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.391 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.392 104238 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.393 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.394 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.395 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.396 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.397 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.398 104238 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.398 104238 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.406 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.406 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.406 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.407 104238 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.407 104238 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.418 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e8ad7b7b-7799-4041-b082-e8facd56e34a (UUID: e8ad7b7b-7799-4041-b082-e8facd56e34a) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.443 104238 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.444 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.444 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.444 104238 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.447 104238 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.452 104238 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.457 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e8ad7b7b-7799-4041-b082-e8facd56e34a'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], external_ids={}, name=e8ad7b7b-7799-4041-b082-e8facd56e34a, nb_cfg_timestamp=1763948638535, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.458 104238 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f3cb2d83160>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.459 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.459 104238 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.459 104238 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.459 104238 INFO oslo_service.service [-] Starting 1 workers
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.463 104238 DEBUG oslo_service.service [-] Started child 104342 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.467 104342 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-161854'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.467 104238 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp_yvvqdpi/privsep.sock']
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.488 104342 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.488 104342 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.488 104342 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.492 104342 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.497 104342 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 01:44:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.502 104342 INFO eventlet.wsgi.server [-] (104342) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 24 01:44:48 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.106 104238 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.107 104238 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_yvvqdpi/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.987 104347 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.995 104347 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:48.999 104347 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.000 104347 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104347
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.110 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[229f7aef-bc58-4131-a552-bcbd227a3141]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.584 104347 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.585 104347 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:44:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:49.585 104347 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.100 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[46391de4-3609-465e-9623-1b77ecd9d2f2]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.103 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, column=external_ids, values=({'neutron:ovn-metadata-id': 'fd2d8d27-e56a-5fb7-a860-8d86fcae1ad5'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.112 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.118 104238 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.118 104238 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.119 104238 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.119 104238 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.119 104238 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.119 104238 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.119 104238 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.120 104238 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.120 104238 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.120 104238 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.120 104238 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.120 104238 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.121 104238 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.121 104238 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.121 104238 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.121 104238 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.122 104238 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.123 104238 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.123 104238 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.123 104238 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.123 104238 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.124 104238 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.124 104238 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.124 104238 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.124 104238 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.124 104238 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.125 104238 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.125 104238 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.125 104238 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.125 104238 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.125 104238 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.126 104238 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.126 104238 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.126 104238 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.126 104238 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.127 104238 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.128 104238 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.129 104238 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.130 104238 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.131 104238 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.132 104238 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.133 104238 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.133 104238 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.133 104238 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.133 104238 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.133 104238 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.134 104238 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.135 104238 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.135 104238 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.135 104238 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.135 104238 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.135 104238 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.136 104238 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.136 104238 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.136 104238 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.136 104238 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.136 104238 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.137 104238 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.138 104238 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.139 104238 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.140 104238 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.140 104238 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.140 104238 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.140 104238 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.140 104238 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.141 104238 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.142 104238 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.143 104238 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.144 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.145 104238 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.146 104238 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.147 104238 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.148 104238 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.149 104238 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.150 104238 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.151 104238 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.152 104238 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.153 104238 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.154 104238 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.155 104238 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.156 104238 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.157 104238 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.158 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.159 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.160 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.161 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:44:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:44:50.162 104238 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:44:51 compute-0 podman[104352]: 2025-11-24 01:44:51.842415508 +0000 UTC m=+0.092105805 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:44:52 compute-0 sshd-session[104379]: Accepted publickey for zuul from 192.168.122.30 port 42278 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:44:52 compute-0 systemd-logind[791]: New session 22 of user zuul.
Nov 24 01:44:52 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 24 01:44:52 compute-0 sshd-session[104379]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:44:53 compute-0 python3.9[104532]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:44:54 compute-0 sudo[104686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfqvacmglrujddruhuqdgxbxxtwvwhdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948694.3920512-34-199179390140693/AnsiballZ_command.py'
Nov 24 01:44:54 compute-0 sudo[104686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:55 compute-0 python3.9[104688]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:44:55 compute-0 sudo[104686]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:56 compute-0 sudo[104850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyjippqimacjqxffcjnbzvmpozhggoec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948695.4685733-45-160203719611303/AnsiballZ_systemd_service.py'
Nov 24 01:44:56 compute-0 sudo[104850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:44:56 compute-0 python3.9[104852]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:44:56 compute-0 systemd[1]: Reloading.
Nov 24 01:44:56 compute-0 systemd-rc-local-generator[104882]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:44:56 compute-0 systemd-sysv-generator[104886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:44:56 compute-0 sudo[104850]: pam_unix(sudo:session): session closed for user root
Nov 24 01:44:57 compute-0 python3.9[105039]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:44:57 compute-0 network[105057]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:44:57 compute-0 network[105058]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:44:57 compute-0 network[105059]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:44:58 compute-0 sshd-session[104912]: Invalid user user from 80.94.95.115 port 55066
Nov 24 01:44:58 compute-0 sshd-session[104912]: Connection closed by invalid user user 80.94.95.115 port 55066 [preauth]
Nov 24 01:45:01 compute-0 sudo[105318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlvheaziznmvefddhhvqflgxppidgjby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948700.8735032-64-174715264917241/AnsiballZ_systemd_service.py'
Nov 24 01:45:01 compute-0 sudo[105318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:01 compute-0 python3.9[105320]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:01 compute-0 sudo[105318]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:01 compute-0 sudo[105471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipktjvbgallwxhkonazstquzivkokust ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948701.5427258-64-270093078878981/AnsiballZ_systemd_service.py'
Nov 24 01:45:01 compute-0 sudo[105471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:02 compute-0 python3.9[105473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:02 compute-0 sudo[105471]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:02 compute-0 sudo[105624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcyplabvsuqawxdmlwnasajkcbyhdjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948702.2995918-64-279796640458874/AnsiballZ_systemd_service.py'
Nov 24 01:45:02 compute-0 sudo[105624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:02 compute-0 python3.9[105626]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:02 compute-0 sudo[105624]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:03 compute-0 sudo[105777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbbpctqsigbcikhfgbxoagvvofsrwfej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948703.0790508-64-16756307874751/AnsiballZ_systemd_service.py'
Nov 24 01:45:03 compute-0 sudo[105777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:03 compute-0 python3.9[105779]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:03 compute-0 sudo[105777]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:04 compute-0 sudo[105930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaevlipilrruiemqbpxvrcmnramgfncm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948703.861047-64-111056095002598/AnsiballZ_systemd_service.py'
Nov 24 01:45:04 compute-0 sudo[105930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:04 compute-0 python3.9[105932]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:04 compute-0 sudo[105930]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:05 compute-0 sudo[106083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzsicgcfarznfqwtswdeduqeregabyqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948704.6803367-64-86385645705564/AnsiballZ_systemd_service.py'
Nov 24 01:45:05 compute-0 sudo[106083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:05 compute-0 python3.9[106085]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:05 compute-0 sudo[106083]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:05 compute-0 sudo[106236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-podstbzpulfvewnsyorumwagakdltsyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948705.541183-64-126817490455501/AnsiballZ_systemd_service.py'
Nov 24 01:45:05 compute-0 sudo[106236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:06 compute-0 python3.9[106238]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:45:06 compute-0 sudo[106236]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:07 compute-0 sudo[106389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slrjprpyioykmmdehegyjnsscsjdrfxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948706.6087894-116-258615285396327/AnsiballZ_file.py'
Nov 24 01:45:07 compute-0 sudo[106389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:07 compute-0 python3.9[106391]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:07 compute-0 sudo[106389]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:07 compute-0 sudo[106541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqxsjvgikfetdtafqoznutrapqknsots ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948707.464775-116-10591276261390/AnsiballZ_file.py'
Nov 24 01:45:07 compute-0 sudo[106541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:08 compute-0 python3.9[106543]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:08 compute-0 sudo[106541]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:08 compute-0 sudo[106693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekeydtdgqcclhmxpzbaqawrmsdjyizqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948708.2162724-116-87830534064336/AnsiballZ_file.py'
Nov 24 01:45:08 compute-0 sudo[106693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:08 compute-0 python3.9[106695]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:08 compute-0 sudo[106693]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:09 compute-0 sudo[106845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvyphhfespwrynpfqabuylhhervgutrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948708.7887826-116-211074086812678/AnsiballZ_file.py'
Nov 24 01:45:09 compute-0 sudo[106845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:09 compute-0 python3.9[106847]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:09 compute-0 sudo[106845]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:09 compute-0 sudo[106997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euvvjjylqywupaqicfklifieqjdddzvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948709.4262183-116-251564072829218/AnsiballZ_file.py'
Nov 24 01:45:09 compute-0 sudo[106997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:09 compute-0 python3.9[106999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:09 compute-0 sudo[106997]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:10 compute-0 sudo[107149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mewrudanhwkqrqefxtfcxrbkwhuckmat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948710.1285942-116-149608339991251/AnsiballZ_file.py'
Nov 24 01:45:10 compute-0 sudo[107149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:10 compute-0 python3.9[107151]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:10 compute-0 sudo[107149]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:11 compute-0 sudo[107301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiqhzrtjquqnwfddrgmoyhojoxlatahu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948710.806365-116-31337752066415/AnsiballZ_file.py'
Nov 24 01:45:11 compute-0 sudo[107301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:11 compute-0 python3.9[107303]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:11 compute-0 sudo[107301]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:11 compute-0 sudo[107453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epltigflaszydsjihqngvnxydsvlpxtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948711.5164056-166-123061176642949/AnsiballZ_file.py'
Nov 24 01:45:11 compute-0 sudo[107453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:11 compute-0 python3.9[107455]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:11 compute-0 sudo[107453]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:12 compute-0 sudo[107605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goonhcsvtukgtsksjsiookiuqxxdqlqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948712.1239355-166-165606213519182/AnsiballZ_file.py'
Nov 24 01:45:12 compute-0 sudo[107605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:12 compute-0 python3.9[107607]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:12 compute-0 sudo[107605]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:13 compute-0 sudo[107757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgkqhlfmkuxftqylnrxohrvblbcwfhve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948712.9037387-166-1821269452109/AnsiballZ_file.py'
Nov 24 01:45:13 compute-0 sudo[107757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:13 compute-0 python3.9[107759]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:13 compute-0 sudo[107757]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:13 compute-0 sudo[107909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydigauttmzalsesehoyurhngckttqxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948713.531663-166-127191337470812/AnsiballZ_file.py'
Nov 24 01:45:13 compute-0 sudo[107909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:14 compute-0 python3.9[107911]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:14 compute-0 sudo[107909]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:14 compute-0 sudo[108061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgneyuvsleiepxzbwnxmsrjjpaxqqkzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948714.2273388-166-114935545078885/AnsiballZ_file.py'
Nov 24 01:45:14 compute-0 sudo[108061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:14 compute-0 python3.9[108063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:14 compute-0 sudo[108061]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:15 compute-0 sudo[108213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjfpudszhvoizrgsvwvqdjncljjbwuiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948714.8347561-166-42073630274218/AnsiballZ_file.py'
Nov 24 01:45:15 compute-0 sudo[108213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:15 compute-0 python3.9[108215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:15 compute-0 sudo[108213]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:15 compute-0 sudo[108365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvztafpytfglspxgqxivpdftwfzgxfxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948715.4991603-166-147932869523576/AnsiballZ_file.py'
Nov 24 01:45:15 compute-0 sudo[108365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:15 compute-0 python3.9[108367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:45:15 compute-0 sudo[108365]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:16 compute-0 sudo[108517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kelzaxvhlftozoqvohbqdrygjjykazsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948716.2452319-217-114667429484297/AnsiballZ_command.py'
Nov 24 01:45:16 compute-0 sudo[108517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:16 compute-0 python3.9[108519]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:16 compute-0 sudo[108517]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:17 compute-0 podman[108645]: 2025-11-24 01:45:17.490216051 +0000 UTC m=+0.094892447 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:45:17 compute-0 python3.9[108680]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:45:18 compute-0 sudo[108837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbfhrlmhrjxfvyvpwvygdhbzaolkrdzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948717.8997133-235-226094031072422/AnsiballZ_systemd_service.py'
Nov 24 01:45:18 compute-0 sudo[108837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:18 compute-0 python3.9[108839]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:45:18 compute-0 systemd[1]: Reloading.
Nov 24 01:45:18 compute-0 systemd-sysv-generator[108868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:45:18 compute-0 systemd-rc-local-generator[108861]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:45:18 compute-0 sudo[108837]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:19 compute-0 sudo[109023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbbwcseanebgsyvmospeiorfyfvxvube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948718.973944-243-234108708996878/AnsiballZ_command.py'
Nov 24 01:45:19 compute-0 sudo[109023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:19 compute-0 python3.9[109025]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:19 compute-0 sudo[109023]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:19 compute-0 sudo[109176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpjgpcjvvodyfdtiharneezhsswrwcrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948719.66351-243-260857844484235/AnsiballZ_command.py'
Nov 24 01:45:19 compute-0 sudo[109176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:20 compute-0 python3.9[109178]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:20 compute-0 sudo[109176]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:20 compute-0 sudo[109329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbsplycyumhpphfejlpnqajieirkbfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948720.3363128-243-211202818495140/AnsiballZ_command.py'
Nov 24 01:45:20 compute-0 sudo[109329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:20 compute-0 python3.9[109331]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:20 compute-0 sudo[109329]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:21 compute-0 sudo[109482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptmbuflyudnrbuystsxxnkqsousupybc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948721.0901253-243-172513099689575/AnsiballZ_command.py'
Nov 24 01:45:21 compute-0 sudo[109482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:21 compute-0 python3.9[109484]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:21 compute-0 sudo[109482]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:22 compute-0 sudo[109646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyscrjmqtokicckziehnveiqaihmkjpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948721.827451-243-168739482128947/AnsiballZ_command.py'
Nov 24 01:45:22 compute-0 sudo[109646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:22 compute-0 podman[109609]: 2025-11-24 01:45:22.187730406 +0000 UTC m=+0.121027713 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 01:45:22 compute-0 python3.9[109650]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:22 compute-0 sudo[109646]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:22 compute-0 sudo[109814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qftldidykdpmalbrwtvalmbkyynugzrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948722.4640791-243-369528791475/AnsiballZ_command.py'
Nov 24 01:45:22 compute-0 sudo[109814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:22 compute-0 python3.9[109816]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:23 compute-0 sudo[109814]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:23 compute-0 sudo[109967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uufyxihhzgxyscscfqbptwligfphotxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948723.1463265-243-128308245373368/AnsiballZ_command.py'
Nov 24 01:45:23 compute-0 sudo[109967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:23 compute-0 python3.9[109969]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:45:23 compute-0 sudo[109967]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:24 compute-0 sudo[110120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqugbrtbxcfhzubgblhoxcbmpohxajgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948724.0281322-297-38575233854648/AnsiballZ_getent.py'
Nov 24 01:45:24 compute-0 sudo[110120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:24 compute-0 python3.9[110122]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 24 01:45:24 compute-0 sudo[110120]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:25 compute-0 sudo[110273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whjyuwmmpjvrdmscmwpwuhpnwowdnwrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948724.963757-305-192479770444428/AnsiballZ_group.py'
Nov 24 01:45:25 compute-0 sudo[110273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:25 compute-0 python3.9[110275]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:45:25 compute-0 groupadd[110276]: group added to /etc/group: name=libvirt, GID=42473
Nov 24 01:45:25 compute-0 groupadd[110276]: group added to /etc/gshadow: name=libvirt
Nov 24 01:45:25 compute-0 groupadd[110276]: new group: name=libvirt, GID=42473
Nov 24 01:45:25 compute-0 sudo[110273]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:26 compute-0 sudo[110433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qadkeovswmejktofiuzptshjgwnejhmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948725.8964763-313-206684621919978/AnsiballZ_user.py'
Nov 24 01:45:26 compute-0 sudo[110433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:26 compute-0 python3.9[110435]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 01:45:26 compute-0 useradd[110437]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 01:45:26 compute-0 sshd-session[110330]: Received disconnect from 46.188.119.26 port 32824:11: Bye Bye [preauth]
Nov 24 01:45:26 compute-0 sshd-session[110330]: Disconnected from authenticating user root 46.188.119.26 port 32824 [preauth]
Nov 24 01:45:26 compute-0 sudo[110433]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:27 compute-0 sudo[110593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiphpzcjidktnbalbujfxaukkndllcfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948727.1162884-324-79760097500461/AnsiballZ_setup.py'
Nov 24 01:45:27 compute-0 sudo[110593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:27 compute-0 python3.9[110595]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:45:28 compute-0 sudo[110593]: pam_unix(sudo:session): session closed for user root
Nov 24 01:45:28 compute-0 sudo[110677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcmywrwjdvrnjgelbpbmcwhssyusfuyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948727.1162884-324-79760097500461/AnsiballZ_dnf.py'
Nov 24 01:45:28 compute-0 sudo[110677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:45:28 compute-0 python3.9[110679]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:45:42 compute-0 sshd-session[110797]: Received disconnect from 203.83.238.175 port 50636:11:  [preauth]
Nov 24 01:45:42 compute-0 sshd-session[110797]: Disconnected from authenticating user root 203.83.238.175 port 50636 [preauth]
Nov 24 01:45:47 compute-0 podman[110872]: 2025-11-24 01:45:47.85380623 +0000 UTC m=+0.088399565 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:45:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:45:48.403 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:45:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:45:48.404 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:45:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:45:48.404 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:45:52 compute-0 podman[110893]: 2025-11-24 01:45:52.891852524 +0000 UTC m=+0.133171454 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 24 01:45:56 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:45:56 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:46:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:46:18 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 24 01:46:18 compute-0 podman[110934]: 2025-11-24 01:46:18.858927442 +0000 UTC m=+0.066465828 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:46:23 compute-0 podman[113408]: 2025-11-24 01:46:23.882183191 +0000 UTC m=+0.121553188 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 24 01:46:44 compute-0 sshd-session[124422]: Received disconnect from 46.188.119.26 port 33154:11: Bye Bye [preauth]
Nov 24 01:46:44 compute-0 sshd-session[124422]: Disconnected from authenticating user root 46.188.119.26 port 33154 [preauth]
Nov 24 01:46:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:46:48.403 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:46:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:46:48.403 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:46:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:46:48.403 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:46:49 compute-0 podman[127762]: 2025-11-24 01:46:49.800186656 +0000 UTC m=+0.058516265 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 01:46:54 compute-0 podman[127795]: 2025-11-24 01:46:54.874523679 +0000 UTC m=+0.118145893 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:47:02 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 01:47:02 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 01:47:03 compute-0 groupadd[127836]: group added to /etc/group: name=dnsmasq, GID=992
Nov 24 01:47:03 compute-0 groupadd[127836]: group added to /etc/gshadow: name=dnsmasq
Nov 24 01:47:03 compute-0 groupadd[127836]: new group: name=dnsmasq, GID=992
Nov 24 01:47:03 compute-0 useradd[127843]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 24 01:47:03 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:47:03 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 24 01:47:03 compute-0 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Nov 24 01:47:04 compute-0 groupadd[127856]: group added to /etc/group: name=clevis, GID=991
Nov 24 01:47:04 compute-0 groupadd[127856]: group added to /etc/gshadow: name=clevis
Nov 24 01:47:04 compute-0 groupadd[127856]: new group: name=clevis, GID=991
Nov 24 01:47:04 compute-0 useradd[127863]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 24 01:47:04 compute-0 usermod[127873]: add 'clevis' to group 'tss'
Nov 24 01:47:04 compute-0 usermod[127873]: add 'clevis' to shadow group 'tss'
Nov 24 01:47:07 compute-0 polkitd[43547]: Reloading rules
Nov 24 01:47:07 compute-0 polkitd[43547]: Collecting garbage unconditionally...
Nov 24 01:47:07 compute-0 polkitd[43547]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 01:47:07 compute-0 polkitd[43547]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 01:47:07 compute-0 polkitd[43547]: Finished loading, compiling and executing 3 rules
Nov 24 01:47:07 compute-0 polkitd[43547]: Reloading rules
Nov 24 01:47:07 compute-0 polkitd[43547]: Collecting garbage unconditionally...
Nov 24 01:47:07 compute-0 polkitd[43547]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 01:47:07 compute-0 polkitd[43547]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 01:47:07 compute-0 polkitd[43547]: Finished loading, compiling and executing 3 rules
Nov 24 01:47:08 compute-0 groupadd[128060]: group added to /etc/group: name=ceph, GID=167
Nov 24 01:47:08 compute-0 groupadd[128060]: group added to /etc/gshadow: name=ceph
Nov 24 01:47:08 compute-0 groupadd[128060]: new group: name=ceph, GID=167
Nov 24 01:47:08 compute-0 useradd[128066]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 24 01:47:11 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 24 01:47:11 compute-0 sshd[1005]: Received signal 15; terminating.
Nov 24 01:47:11 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 24 01:47:11 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 24 01:47:11 compute-0 systemd[1]: sshd.service: Consumed 2.069s CPU time, read 32.0K from disk, written 8.0K to disk.
Nov 24 01:47:11 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 24 01:47:11 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 24 01:47:11 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:47:11 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:47:11 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 01:47:11 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 24 01:47:11 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 24 01:47:11 compute-0 sshd[128585]: Server listening on 0.0.0.0 port 22.
Nov 24 01:47:11 compute-0 sshd[128585]: Server listening on :: port 22.
Nov 24 01:47:11 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 24 01:47:13 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:47:13 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:47:13 compute-0 systemd[1]: Reloading.
Nov 24 01:47:13 compute-0 systemd-rc-local-generator[128843]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:13 compute-0 systemd-sysv-generator[128846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:13 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:47:16 compute-0 sudo[110677]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:17 compute-0 sudo[132543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmgrkxevkvsnzybgeysgmwzgvzxtsih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948836.5510361-336-126508837717999/AnsiballZ_systemd.py'
Nov 24 01:47:17 compute-0 sudo[132543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:17 compute-0 python3.9[132568]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:47:17 compute-0 systemd[1]: Reloading.
Nov 24 01:47:17 compute-0 systemd-sysv-generator[133133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:17 compute-0 systemd-rc-local-generator[133122]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:17 compute-0 sudo[132543]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:18 compute-0 sudo[133993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpyggdidazzpmguruxehpozhmdwgaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948838.0620763-336-15791524776325/AnsiballZ_systemd.py'
Nov 24 01:47:18 compute-0 sudo[133993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:18 compute-0 python3.9[134014]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:47:18 compute-0 systemd[1]: Reloading.
Nov 24 01:47:18 compute-0 systemd-sysv-generator[134538]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:18 compute-0 systemd-rc-local-generator[134534]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:19 compute-0 sudo[133993]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:19 compute-0 sudo[135267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfgvudlxlewklfjbrqeoqyywnjqckbns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948839.3001578-336-114105582769151/AnsiballZ_systemd.py'
Nov 24 01:47:19 compute-0 sudo[135267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:19 compute-0 python3.9[135289]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:47:19 compute-0 systemd[1]: Reloading.
Nov 24 01:47:19 compute-0 podman[135526]: 2025-11-24 01:47:19.986551573 +0000 UTC m=+0.064774617 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 01:47:20 compute-0 systemd-rc-local-generator[135673]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:20 compute-0 systemd-sysv-generator[135677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:20 compute-0 sudo[135267]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:21 compute-0 sudo[136807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyoltekjczrnlbxuxgurxrgkcntfctkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948840.4049475-336-261354231824007/AnsiballZ_systemd.py'
Nov 24 01:47:21 compute-0 sudo[136807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:21 compute-0 python3.9[136828]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:47:21 compute-0 systemd[1]: Reloading.
Nov 24 01:47:21 compute-0 systemd-rc-local-generator[137275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:21 compute-0 systemd-sysv-generator[137281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:21 compute-0 sudo[136807]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:22 compute-0 sudo[138041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwbkbcbhquahohenovkatusddmyizeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948841.860156-365-75292425721641/AnsiballZ_systemd.py'
Nov 24 01:47:22 compute-0 sudo[138041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:22 compute-0 python3.9[138043]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:47:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:47:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 11.126s CPU time.
Nov 24 01:47:22 compute-0 systemd[1]: run-rafd34ce522ea4f859b5d739e57b868b7.service: Deactivated successfully.
Nov 24 01:47:22 compute-0 systemd[1]: Reloading.
Nov 24 01:47:22 compute-0 systemd-sysv-generator[138194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:22 compute-0 systemd-rc-local-generator[138191]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:22 compute-0 sudo[138041]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:23 compute-0 sudo[138348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlufnylbtovitsksxonqdunblymheafv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948843.014174-365-87850517925860/AnsiballZ_systemd.py'
Nov 24 01:47:23 compute-0 sudo[138348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:23 compute-0 python3.9[138350]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:23 compute-0 systemd[1]: Reloading.
Nov 24 01:47:23 compute-0 systemd-rc-local-generator[138376]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:23 compute-0 systemd-sysv-generator[138379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:24 compute-0 sudo[138348]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:24 compute-0 sudo[138537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntihcakdtggigvsjzmgaoqcejpczhalk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948844.142064-365-225472877925291/AnsiballZ_systemd.py'
Nov 24 01:47:24 compute-0 sudo[138537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:24 compute-0 python3.9[138539]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:25 compute-0 systemd[1]: Reloading.
Nov 24 01:47:25 compute-0 podman[138542]: 2025-11-24 01:47:25.851402009 +0000 UTC m=+0.092186782 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 01:47:25 compute-0 systemd-rc-local-generator[138596]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:25 compute-0 systemd-sysv-generator[138600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:26 compute-0 sudo[138537]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:26 compute-0 sudo[138753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chndepbidfzzlxgkoqjwxzlgpwskldra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948846.2517416-365-206232260552293/AnsiballZ_systemd.py'
Nov 24 01:47:26 compute-0 sudo[138753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:26 compute-0 python3.9[138755]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:27 compute-0 sudo[138753]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:27 compute-0 sudo[138908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhfhijfsfgqxsoazflrywzwbidnhnfma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948847.15945-365-169567991276438/AnsiballZ_systemd.py'
Nov 24 01:47:27 compute-0 sudo[138908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:27 compute-0 python3.9[138910]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:27 compute-0 systemd[1]: Reloading.
Nov 24 01:47:27 compute-0 systemd-sysv-generator[138944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:27 compute-0 systemd-rc-local-generator[138941]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:28 compute-0 sudo[138908]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:28 compute-0 sudo[139100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufovxjqypwxokqvarondoggiqcxrwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948848.3094935-401-78157544959676/AnsiballZ_systemd.py'
Nov 24 01:47:28 compute-0 sudo[139100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:28 compute-0 python3.9[139102]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 01:47:28 compute-0 systemd[1]: Reloading.
Nov 24 01:47:29 compute-0 systemd-sysv-generator[139136]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:47:29 compute-0 systemd-rc-local-generator[139131]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:47:29 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 24 01:47:29 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 24 01:47:29 compute-0 sudo[139100]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:29 compute-0 sudo[139293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqhcwnbsfiwwpgwuoupkqtdehgdijfci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948849.5268478-409-143484820892187/AnsiballZ_systemd.py'
Nov 24 01:47:29 compute-0 sudo[139293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:30 compute-0 python3.9[139295]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:30 compute-0 sudo[139293]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:30 compute-0 sudo[139448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozbknkrbkbjpspgtimxhiuxreptsoprh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948850.3823075-409-74543710985269/AnsiballZ_systemd.py'
Nov 24 01:47:30 compute-0 sudo[139448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:31 compute-0 python3.9[139450]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:31 compute-0 sudo[139448]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:31 compute-0 sudo[139603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuawgialzaeuendbvuviftsrjfmoksmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948851.3076782-409-210802481420934/AnsiballZ_systemd.py'
Nov 24 01:47:31 compute-0 sudo[139603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:31 compute-0 python3.9[139605]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:31 compute-0 sudo[139603]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:32 compute-0 sudo[139758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dborhedjyxrkqjgjshqdzlcmpfinwptx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948852.0747964-409-95262810855018/AnsiballZ_systemd.py'
Nov 24 01:47:32 compute-0 sudo[139758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:32 compute-0 python3.9[139760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:32 compute-0 sudo[139758]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:33 compute-0 sudo[139913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufgelwllkiqsxpxkvqucwgzrzgvayqxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948852.9342368-409-33271228944882/AnsiballZ_systemd.py'
Nov 24 01:47:33 compute-0 sudo[139913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:33 compute-0 python3.9[139915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:33 compute-0 sudo[139913]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:34 compute-0 sudo[140068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njpyhqawjclepbirbgqjotvdktmnndwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948853.7937572-409-164989121309339/AnsiballZ_systemd.py'
Nov 24 01:47:34 compute-0 sudo[140068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:34 compute-0 python3.9[140070]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:34 compute-0 sudo[140068]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:34 compute-0 sudo[140223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klflfpbcvroqhtlfdyxuseujhrpgbryd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948854.6226666-409-255584623831152/AnsiballZ_systemd.py'
Nov 24 01:47:34 compute-0 sudo[140223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:35 compute-0 python3.9[140225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:35 compute-0 sudo[140223]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:35 compute-0 sudo[140378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dczzqlfycmeeibbgkhaxwarrrvlksqbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948855.4096766-409-104955506433541/AnsiballZ_systemd.py'
Nov 24 01:47:35 compute-0 sudo[140378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:36 compute-0 python3.9[140380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:36 compute-0 sudo[140378]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:36 compute-0 sudo[140533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvhyzwwvyxeryoneuofuchuzsilumjbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948856.3268378-409-46290895729149/AnsiballZ_systemd.py'
Nov 24 01:47:36 compute-0 sudo[140533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:37 compute-0 python3.9[140535]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:37 compute-0 sudo[140533]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:37 compute-0 sudo[140688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itglxxvmprqdhygdedezlovkpftvrlah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948857.334263-409-214857763944029/AnsiballZ_systemd.py'
Nov 24 01:47:37 compute-0 sudo[140688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:37 compute-0 python3.9[140690]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:38 compute-0 sudo[140688]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:38 compute-0 sudo[140843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrzqsuxnjyvdqdpmwdeuouuzpgbgzwuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948858.214274-409-260652139623413/AnsiballZ_systemd.py'
Nov 24 01:47:38 compute-0 sudo[140843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:38 compute-0 python3.9[140845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:38 compute-0 sudo[140843]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:39 compute-0 sudo[140998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfxyypupdqvmaqexbpelfvkqwdiktgie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948859.0773351-409-272388320753766/AnsiballZ_systemd.py'
Nov 24 01:47:39 compute-0 sudo[140998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:39 compute-0 python3.9[141000]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:39 compute-0 sudo[140998]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:40 compute-0 sudo[141153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxhdpvrqccwrwzowfjcesxgtiusyjiwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948859.9734628-409-183800764100304/AnsiballZ_systemd.py'
Nov 24 01:47:40 compute-0 sudo[141153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:40 compute-0 python3.9[141155]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:40 compute-0 sudo[141153]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:41 compute-0 sudo[141308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqcygoziictjvpionbojmtcieljpwlqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948860.8316948-409-160430411355388/AnsiballZ_systemd.py'
Nov 24 01:47:41 compute-0 sudo[141308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:41 compute-0 python3.9[141310]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 01:47:41 compute-0 sudo[141308]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:42 compute-0 sudo[141463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haoihjogsdiueywqcadikppzcwtmdwer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948861.9728663-511-56976800241706/AnsiballZ_file.py'
Nov 24 01:47:42 compute-0 sudo[141463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:42 compute-0 python3.9[141465]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:42 compute-0 sudo[141463]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:42 compute-0 sudo[141615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnuqctkxkmnjyxszjjcivggyyojyuzxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948862.6268296-511-142319638068544/AnsiballZ_file.py'
Nov 24 01:47:42 compute-0 sudo[141615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:43 compute-0 python3.9[141617]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:43 compute-0 sudo[141615]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:43 compute-0 sudo[141767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clduqrmydwzljkztbrxahnaxfrrexkpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948863.3328278-511-83170255967668/AnsiballZ_file.py'
Nov 24 01:47:43 compute-0 sudo[141767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:43 compute-0 python3.9[141769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:43 compute-0 sudo[141767]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:44 compute-0 sudo[141919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgblpagnodmoywaeudjxbgtdmdzbaomb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948864.0265632-511-172450145986260/AnsiballZ_file.py'
Nov 24 01:47:44 compute-0 sudo[141919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:44 compute-0 python3.9[141921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:44 compute-0 sudo[141919]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:44 compute-0 sudo[142071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbbgmevdcvijtdeoedzqvuxjyegericj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948864.6710484-511-97860761561072/AnsiballZ_file.py'
Nov 24 01:47:44 compute-0 sudo[142071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:45 compute-0 python3.9[142073]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:45 compute-0 sudo[142071]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:45 compute-0 sudo[142223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sacmqvhoosllkdvzwxzlvnksnpoobhed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948865.34722-511-247237777097966/AnsiballZ_file.py'
Nov 24 01:47:45 compute-0 sudo[142223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:45 compute-0 python3.9[142225]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:47:45 compute-0 sudo[142223]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:46 compute-0 sudo[142375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrqypjhcemxhfiyoyqdeeanxukysebnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948866.0068457-554-54440818324644/AnsiballZ_stat.py'
Nov 24 01:47:46 compute-0 sudo[142375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:46 compute-0 python3.9[142377]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:46 compute-0 sudo[142375]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:47 compute-0 sudo[142500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cczhkpwxygxygdyedltunhdkpnchoata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948866.0068457-554-54440818324644/AnsiballZ_copy.py'
Nov 24 01:47:47 compute-0 sudo[142500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:47 compute-0 python3.9[142502]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948866.0068457-554-54440818324644/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:47 compute-0 sudo[142500]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:47 compute-0 sudo[142652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeoahgbfhrqbbniktghegazlsmvvnsvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948867.6457295-554-183728977625719/AnsiballZ_stat.py'
Nov 24 01:47:47 compute-0 sudo[142652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:48 compute-0 python3.9[142654]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:48 compute-0 sudo[142652]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:47:48.403 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:47:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:47:48.405 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:47:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:47:48.405 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:47:48 compute-0 sudo[142777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqtacrjyxenldocmpqyttiwpxmgjmspn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948867.6457295-554-183728977625719/AnsiballZ_copy.py'
Nov 24 01:47:48 compute-0 sudo[142777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:48 compute-0 python3.9[142779]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948867.6457295-554-183728977625719/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:48 compute-0 sudo[142777]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:49 compute-0 sudo[142929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgwojgvidpmskwzipzqbaqvpntazclrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948868.9909973-554-165084855471341/AnsiballZ_stat.py'
Nov 24 01:47:49 compute-0 sudo[142929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:49 compute-0 python3.9[142931]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:49 compute-0 sudo[142929]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:49 compute-0 sudo[143054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgjrspozrqgushwbyphhkoczuhhwadsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948868.9909973-554-165084855471341/AnsiballZ_copy.py'
Nov 24 01:47:49 compute-0 sudo[143054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:50 compute-0 python3.9[143056]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948868.9909973-554-165084855471341/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:50 compute-0 sudo[143054]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:50 compute-0 sudo[143216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufvmzmwfidavcodcgmriulzyomazwuqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948870.304795-554-200246281981024/AnsiballZ_stat.py'
Nov 24 01:47:50 compute-0 sudo[143216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:50 compute-0 podman[143180]: 2025-11-24 01:47:50.646733711 +0000 UTC m=+0.056568360 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:47:50 compute-0 python3.9[143224]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:50 compute-0 sudo[143216]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:51 compute-0 sudo[143350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgdckxrgfltgheuycgnjwgxilzsccrfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948870.304795-554-200246281981024/AnsiballZ_copy.py'
Nov 24 01:47:51 compute-0 sudo[143350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:51 compute-0 python3.9[143352]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948870.304795-554-200246281981024/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:51 compute-0 sudo[143350]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:52 compute-0 sudo[143502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvvaxwzwiwvgbwbmuqxcfarhtudprjuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948871.7675867-554-27619626316759/AnsiballZ_stat.py'
Nov 24 01:47:52 compute-0 sudo[143502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:52 compute-0 python3.9[143504]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:52 compute-0 sudo[143502]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:52 compute-0 sudo[143627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmgstpnazqzjmbzxbrgchvdgwqrxshxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948871.7675867-554-27619626316759/AnsiballZ_copy.py'
Nov 24 01:47:52 compute-0 sudo[143627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:52 compute-0 python3.9[143629]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948871.7675867-554-27619626316759/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:53 compute-0 sudo[143627]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:53 compute-0 sudo[143779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhhypzmtwytuodfpxsgvxhztypiwukrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948873.1550903-554-185244670499241/AnsiballZ_stat.py'
Nov 24 01:47:53 compute-0 sudo[143779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:53 compute-0 python3.9[143781]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:53 compute-0 sudo[143779]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:54 compute-0 sudo[143904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfopdmwufwxwqxwcydpnupuwgitojhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948873.1550903-554-185244670499241/AnsiballZ_copy.py'
Nov 24 01:47:54 compute-0 sudo[143904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:54 compute-0 python3.9[143906]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948873.1550903-554-185244670499241/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:54 compute-0 sudo[143904]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:54 compute-0 sudo[144056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkfgwonkfjpsgshykahgtwlemlzswvvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948874.3880422-554-135543958633444/AnsiballZ_stat.py'
Nov 24 01:47:54 compute-0 sudo[144056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:54 compute-0 python3.9[144058]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:54 compute-0 sudo[144056]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:55 compute-0 sudo[144179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whffdjunjpaseuoqtjovysquihknnsjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948874.3880422-554-135543958633444/AnsiballZ_copy.py'
Nov 24 01:47:55 compute-0 sudo[144179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:55 compute-0 python3.9[144181]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948874.3880422-554-135543958633444/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:55 compute-0 sudo[144179]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:56 compute-0 sudo[144331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aregqpzkafcpwpepqdiovojtvbptbnas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948875.6870081-554-32589870977752/AnsiballZ_stat.py'
Nov 24 01:47:56 compute-0 sudo[144331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:56 compute-0 python3.9[144333]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:47:56 compute-0 sudo[144331]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:56 compute-0 sudo[144473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgffwldqwgtgtciprvylpywjdntcidy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948875.6870081-554-32589870977752/AnsiballZ_copy.py'
Nov 24 01:47:56 compute-0 sudo[144473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:56 compute-0 podman[144430]: 2025-11-24 01:47:56.670758817 +0000 UTC m=+0.103124728 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 24 01:47:56 compute-0 python3.9[144480]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763948875.6870081-554-32589870977752/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:56 compute-0 sudo[144473]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:57 compute-0 sudo[144635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfqvzfypyflhigwhccmrocqgedwxtxpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948877.1032414-667-13779977219639/AnsiballZ_command.py'
Nov 24 01:47:57 compute-0 sudo[144635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:57 compute-0 python3.9[144637]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 24 01:47:57 compute-0 sudo[144635]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:58 compute-0 sudo[144788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbczmqiuiyvwwzvyplfwfapxvdqliglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948877.8592544-676-70971780225617/AnsiballZ_file.py'
Nov 24 01:47:58 compute-0 sudo[144788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:58 compute-0 python3.9[144790]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:58 compute-0 sudo[144788]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:58 compute-0 sudo[144940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nztwwfwjlzzlarsgngrppyhhauocuntw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948878.5867195-676-239510599058794/AnsiballZ_file.py'
Nov 24 01:47:58 compute-0 sudo[144940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:59 compute-0 python3.9[144942]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:59 compute-0 sudo[144940]: pam_unix(sudo:session): session closed for user root
Nov 24 01:47:59 compute-0 sudo[145094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgvlxfehmafcdpplcgkqdkagdedikoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948879.2249713-676-190406232304751/AnsiballZ_file.py'
Nov 24 01:47:59 compute-0 sudo[145094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:47:59 compute-0 python3.9[145096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:47:59 compute-0 sudo[145094]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:00 compute-0 sshd-session[145063]: Invalid user work from 46.188.119.26 port 33482
Nov 24 01:48:00 compute-0 sudo[145246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pymtlnafagfyrwidhkhjpbnkekzlahwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948879.924429-676-18719004055641/AnsiballZ_file.py'
Nov 24 01:48:00 compute-0 sudo[145246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:00 compute-0 sshd-session[145063]: Received disconnect from 46.188.119.26 port 33482:11: Bye Bye [preauth]
Nov 24 01:48:00 compute-0 sshd-session[145063]: Disconnected from invalid user work 46.188.119.26 port 33482 [preauth]
Nov 24 01:48:00 compute-0 python3.9[145248]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:00 compute-0 sudo[145246]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:00 compute-0 sudo[145398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekcscaisfbvkprjbrnmvudfngxgbzako ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948880.6106915-676-157704678935772/AnsiballZ_file.py'
Nov 24 01:48:00 compute-0 sudo[145398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:01 compute-0 python3.9[145400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:01 compute-0 sudo[145398]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:01 compute-0 sudo[145550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twhedchxnwsftoenahybacqolgdexvcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948881.2800732-676-94751458189943/AnsiballZ_file.py'
Nov 24 01:48:01 compute-0 sudo[145550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:01 compute-0 python3.9[145552]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:01 compute-0 sudo[145550]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:02 compute-0 sudo[145702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cusjarkskpvwpfiyqaxigjvdlppyyjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948882.051732-676-227214724692961/AnsiballZ_file.py'
Nov 24 01:48:02 compute-0 sudo[145702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:02 compute-0 python3.9[145704]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:02 compute-0 sudo[145702]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:02 compute-0 sudo[145854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbjfqjriowqqlnjrjrlvbsiexyjvptfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948882.7236872-676-250140957355709/AnsiballZ_file.py'
Nov 24 01:48:02 compute-0 sudo[145854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:03 compute-0 python3.9[145856]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:03 compute-0 sudo[145854]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:03 compute-0 sudo[146006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqeglqymwjslgusfqrxanhxxaladxyfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948883.3380227-676-210338705676066/AnsiballZ_file.py'
Nov 24 01:48:03 compute-0 sudo[146006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:03 compute-0 python3.9[146008]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:03 compute-0 sudo[146006]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:04 compute-0 sudo[146158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zehvxyvusytttjythxxtdwrxteapdpnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948883.9822474-676-219467055146350/AnsiballZ_file.py'
Nov 24 01:48:04 compute-0 sudo[146158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:04 compute-0 python3.9[146160]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:04 compute-0 sudo[146158]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:05 compute-0 sudo[146310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myvcktuzmjpvmrdwydzbfowerparuebl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948884.7181263-676-252743481751291/AnsiballZ_file.py'
Nov 24 01:48:05 compute-0 sudo[146310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:05 compute-0 python3.9[146312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:05 compute-0 sudo[146310]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:05 compute-0 sudo[146462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgkvwldpkvsxkwfxrjmerwnffausbevs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948885.4138253-676-105994731396967/AnsiballZ_file.py'
Nov 24 01:48:05 compute-0 sudo[146462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:05 compute-0 python3.9[146464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:05 compute-0 sudo[146462]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:06 compute-0 sudo[146614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jurlffhqkmrvfyetzmnyaagnaeprepcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948886.083898-676-20520362087170/AnsiballZ_file.py'
Nov 24 01:48:06 compute-0 sudo[146614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:06 compute-0 python3.9[146616]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:06 compute-0 sudo[146614]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:07 compute-0 sudo[146766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmfyrntvatacymfffcgrqewraduhdsax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948886.7596483-676-69835981370846/AnsiballZ_file.py'
Nov 24 01:48:07 compute-0 sudo[146766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:07 compute-0 python3.9[146768]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:07 compute-0 sudo[146766]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:07 compute-0 sudo[146918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uohiyvfbzotuhxutmcvtnhmqubrrppmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948887.568896-775-13734740368399/AnsiballZ_stat.py'
Nov 24 01:48:07 compute-0 sudo[146918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:08 compute-0 python3.9[146920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:08 compute-0 sudo[146918]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:08 compute-0 sudo[147041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukkmicxkxumntfhtuwmgpjaqxrnpwxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948887.568896-775-13734740368399/AnsiballZ_copy.py'
Nov 24 01:48:08 compute-0 sudo[147041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:08 compute-0 python3.9[147043]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948887.568896-775-13734740368399/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:08 compute-0 sudo[147041]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:09 compute-0 sudo[147193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcjebcdbcwbvaqsdhkxgcjxlhbbutoyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948888.79517-775-51314927918072/AnsiballZ_stat.py'
Nov 24 01:48:09 compute-0 sudo[147193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:09 compute-0 python3.9[147195]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:09 compute-0 sudo[147193]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:09 compute-0 sudo[147316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbrledureogvpxehyllxivxohwshxorf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948888.79517-775-51314927918072/AnsiballZ_copy.py'
Nov 24 01:48:09 compute-0 sudo[147316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:09 compute-0 python3.9[147318]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948888.79517-775-51314927918072/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:09 compute-0 sudo[147316]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:10 compute-0 sudo[147468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dapjbnsdnxzxlkwhbrjcycumhqyhozul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948890.051354-775-11134207706171/AnsiballZ_stat.py'
Nov 24 01:48:10 compute-0 sudo[147468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:10 compute-0 python3.9[147470]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:10 compute-0 sudo[147468]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:10 compute-0 sudo[147591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cosnvylfezabktwuwqtcnepqqrloszrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948890.051354-775-11134207706171/AnsiballZ_copy.py'
Nov 24 01:48:10 compute-0 sudo[147591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:11 compute-0 python3.9[147593]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948890.051354-775-11134207706171/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:11 compute-0 sudo[147591]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:11 compute-0 sudo[147743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctdcuttsqicftsmjxfbsnfzbshxvabuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948891.3115547-775-177231572069803/AnsiballZ_stat.py'
Nov 24 01:48:11 compute-0 sudo[147743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:11 compute-0 python3.9[147745]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:11 compute-0 sudo[147743]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:12 compute-0 sudo[147866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laxfoitcodxqqksvnxdpnisocmzufmso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948891.3115547-775-177231572069803/AnsiballZ_copy.py'
Nov 24 01:48:12 compute-0 sudo[147866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:12 compute-0 python3.9[147868]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948891.3115547-775-177231572069803/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:12 compute-0 sudo[147866]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:12 compute-0 sudo[148018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wicneobdgeqntpqjytnvctdnqjgqxcnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948892.5318897-775-258587283675897/AnsiballZ_stat.py'
Nov 24 01:48:12 compute-0 sudo[148018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:13 compute-0 python3.9[148020]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:13 compute-0 sudo[148018]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:13 compute-0 sudo[148141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulxkvcfufbzjmpslqkjrwzxxcgimxrhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948892.5318897-775-258587283675897/AnsiballZ_copy.py'
Nov 24 01:48:13 compute-0 sudo[148141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:13 compute-0 python3.9[148143]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948892.5318897-775-258587283675897/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:13 compute-0 sudo[148141]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:14 compute-0 sudo[148293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwkcqsuxhevhwllwfretebihtogdvlqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948893.8326538-775-48328330524260/AnsiballZ_stat.py'
Nov 24 01:48:14 compute-0 sudo[148293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:14 compute-0 python3.9[148295]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:14 compute-0 sudo[148293]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:14 compute-0 sudo[148416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siqgesanupetgmjoxoxjljxfuomevfuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948893.8326538-775-48328330524260/AnsiballZ_copy.py'
Nov 24 01:48:14 compute-0 sudo[148416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:14 compute-0 python3.9[148418]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948893.8326538-775-48328330524260/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:14 compute-0 sudo[148416]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:15 compute-0 sudo[148568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zowlmisbsbbeiccfpddxbgpjszsjdrfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948895.0737495-775-11484197880382/AnsiballZ_stat.py'
Nov 24 01:48:15 compute-0 sudo[148568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:15 compute-0 python3.9[148570]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:15 compute-0 sudo[148568]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:15 compute-0 sudo[148691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwgqnqgztdtetfjswyyaznvwwuwkesz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948895.0737495-775-11484197880382/AnsiballZ_copy.py'
Nov 24 01:48:15 compute-0 sudo[148691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:16 compute-0 python3.9[148693]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948895.0737495-775-11484197880382/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:16 compute-0 sudo[148691]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:16 compute-0 sudo[148843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iigegtaspiislorptkmrrrjqohbkpene ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948896.285355-775-210994522026008/AnsiballZ_stat.py'
Nov 24 01:48:16 compute-0 sudo[148843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:16 compute-0 python3.9[148845]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:16 compute-0 sudo[148843]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:17 compute-0 sudo[148966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqhcxaqtoyuxxzyqwptjgeinuyiihnzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948896.285355-775-210994522026008/AnsiballZ_copy.py'
Nov 24 01:48:17 compute-0 sudo[148966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:17 compute-0 python3.9[148968]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948896.285355-775-210994522026008/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:17 compute-0 sudo[148966]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:17 compute-0 sudo[149118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqzhrmpzkbwxohzhrbipemkslpeetfzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948897.534077-775-276054460822785/AnsiballZ_stat.py'
Nov 24 01:48:17 compute-0 sudo[149118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:18 compute-0 python3.9[149120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:18 compute-0 sudo[149118]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:18 compute-0 sudo[149241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-likqqlvearjlbajwewrnjetadjyahzbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948897.534077-775-276054460822785/AnsiballZ_copy.py'
Nov 24 01:48:18 compute-0 sudo[149241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:18 compute-0 python3.9[149243]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948897.534077-775-276054460822785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:18 compute-0 sudo[149241]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:19 compute-0 sudo[149393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yptukaffugazjjmpzvifvqilzkdakncl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948898.7741456-775-140399957414690/AnsiballZ_stat.py'
Nov 24 01:48:19 compute-0 sudo[149393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:19 compute-0 python3.9[149395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:19 compute-0 sudo[149393]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:20 compute-0 sudo[149516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zusknppqpywoxhoplnmgwfrszxasjcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948898.7741456-775-140399957414690/AnsiballZ_copy.py'
Nov 24 01:48:20 compute-0 sudo[149516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:20 compute-0 python3.9[149518]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948898.7741456-775-140399957414690/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:20 compute-0 sudo[149516]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:20 compute-0 sudo[149668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbtloaftijrhvtkqylonydpgfeqdauxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948900.3746555-775-86101115667269/AnsiballZ_stat.py'
Nov 24 01:48:20 compute-0 sudo[149668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:20 compute-0 podman[149670]: 2025-11-24 01:48:20.814804495 +0000 UTC m=+0.088981675 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 01:48:20 compute-0 python3.9[149671]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:20 compute-0 sudo[149668]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:21 compute-0 sudo[149812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yknrnlxpxtvcjwpyhoylmkopauztqkog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948900.3746555-775-86101115667269/AnsiballZ_copy.py'
Nov 24 01:48:21 compute-0 sudo[149812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:22 compute-0 python3.9[149814]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948900.3746555-775-86101115667269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:22 compute-0 sudo[149812]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:22 compute-0 sudo[149964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-instrdgdibkepnthrwiulgiloyorfndh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948902.2880495-775-276047140959467/AnsiballZ_stat.py'
Nov 24 01:48:22 compute-0 sudo[149964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:22 compute-0 python3.9[149966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:22 compute-0 sudo[149964]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:23 compute-0 sudo[150087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocucldffvnzjqmnbyflxovbodespgxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948902.2880495-775-276047140959467/AnsiballZ_copy.py'
Nov 24 01:48:23 compute-0 sudo[150087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:23 compute-0 python3.9[150089]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948902.2880495-775-276047140959467/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:23 compute-0 sudo[150087]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:23 compute-0 sudo[150239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcmkwyfegsovkcxivbuxvhayocjeywfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948903.5157351-775-244269174301800/AnsiballZ_stat.py'
Nov 24 01:48:23 compute-0 sudo[150239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:24 compute-0 python3.9[150241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:24 compute-0 sudo[150239]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:24 compute-0 sudo[150362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwvrtrfvzmavbanvzhkcpwqeuyovxvht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948903.5157351-775-244269174301800/AnsiballZ_copy.py'
Nov 24 01:48:24 compute-0 sudo[150362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:24 compute-0 python3.9[150364]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948903.5157351-775-244269174301800/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:24 compute-0 sudo[150362]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:25 compute-0 sudo[150514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxbxfkwxppadaamacxukafafjypcokpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948904.8685665-775-190045161277308/AnsiballZ_stat.py'
Nov 24 01:48:25 compute-0 sudo[150514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:25 compute-0 python3.9[150516]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:25 compute-0 sudo[150514]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:25 compute-0 sudo[150637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtmyqedfxhweyqnzgavocxvxoyrkwcaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948904.8685665-775-190045161277308/AnsiballZ_copy.py'
Nov 24 01:48:25 compute-0 sudo[150637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:25 compute-0 python3.9[150639]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948904.8685665-775-190045161277308/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:25 compute-0 sudo[150637]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:26 compute-0 python3.9[150789]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:48:26 compute-0 podman[150817]: 2025-11-24 01:48:26.831417804 +0000 UTC m=+0.078230462 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:48:27 compute-0 sudo[150968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pneqgouzwusgbbckximrqjywhtsfqcpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948906.8865328-981-273314101161784/AnsiballZ_seboolean.py'
Nov 24 01:48:27 compute-0 sudo[150968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:27 compute-0 python3.9[150970]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 24 01:48:28 compute-0 sudo[150968]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:29 compute-0 sudo[151124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unmywfyafnnumvxcklhgnmvdvxuucarq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948908.9762435-989-138039845217188/AnsiballZ_copy.py'
Nov 24 01:48:29 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 24 01:48:29 compute-0 sudo[151124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:29 compute-0 python3.9[151126]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:29 compute-0 sudo[151124]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:30 compute-0 sudo[151276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvhoeuabmyoervkjwljydlqqvxzvbkwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948909.7507246-989-205650285091211/AnsiballZ_copy.py'
Nov 24 01:48:30 compute-0 sudo[151276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:30 compute-0 python3.9[151278]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:30 compute-0 sudo[151276]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:30 compute-0 sudo[151428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrdpgpxsrbymqpbvgsryaptlrhzldova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948910.4203236-989-130215095140337/AnsiballZ_copy.py'
Nov 24 01:48:30 compute-0 sudo[151428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:30 compute-0 python3.9[151430]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:30 compute-0 sudo[151428]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:31 compute-0 sudo[151580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjyqinmjrrgcqrbrujrbitfbcetomqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948911.0440433-989-222553262953767/AnsiballZ_copy.py'
Nov 24 01:48:31 compute-0 sudo[151580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:31 compute-0 python3.9[151582]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:31 compute-0 sudo[151580]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:32 compute-0 sudo[151732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaakowhaqxaryjxaylfhcsltjrsxfcdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948911.7034616-989-7825221272089/AnsiballZ_copy.py'
Nov 24 01:48:32 compute-0 sudo[151732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:32 compute-0 python3.9[151734]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:32 compute-0 sudo[151732]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:32 compute-0 sudo[151884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxdzhwphzznsfjtrsvtuxzunxvooiqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948912.4220266-1025-34724315008577/AnsiballZ_copy.py'
Nov 24 01:48:32 compute-0 sudo[151884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:32 compute-0 python3.9[151886]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:32 compute-0 sudo[151884]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:33 compute-0 sudo[152036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfiliqhajeihzjormcgonrmwbychhzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948913.1185331-1025-117913356915247/AnsiballZ_copy.py'
Nov 24 01:48:33 compute-0 sudo[152036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:33 compute-0 python3.9[152038]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:33 compute-0 sudo[152036]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:34 compute-0 sudo[152188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugmryqsyphvshuyxsssledkminarbdiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948913.8390331-1025-281352780847012/AnsiballZ_copy.py'
Nov 24 01:48:34 compute-0 sudo[152188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:34 compute-0 python3.9[152190]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:34 compute-0 sudo[152188]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:34 compute-0 sudo[152340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctfqkzdqkojqltlmflmhdnfqbcdxqvjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948914.5432153-1025-106792010920385/AnsiballZ_copy.py'
Nov 24 01:48:34 compute-0 sudo[152340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:35 compute-0 python3.9[152342]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:35 compute-0 sudo[152340]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:35 compute-0 sudo[152492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qafduerpyhbusxstipfcndqjjndxysus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948915.2865863-1025-277303551699199/AnsiballZ_copy.py'
Nov 24 01:48:35 compute-0 sudo[152492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:35 compute-0 python3.9[152494]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:35 compute-0 sudo[152492]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:36 compute-0 sudo[152644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcgdhhmcyfreeusbqhshjqieoytsfeeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948916.0612078-1061-85474321850868/AnsiballZ_systemd.py'
Nov 24 01:48:36 compute-0 sudo[152644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:36 compute-0 python3.9[152646]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:48:36 compute-0 systemd[1]: Reloading.
Nov 24 01:48:36 compute-0 systemd-rc-local-generator[152671]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:48:36 compute-0 systemd-sysv-generator[152677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:48:37 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 24 01:48:37 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 24 01:48:37 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 24 01:48:37 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 24 01:48:37 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 24 01:48:37 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 24 01:48:37 compute-0 sudo[152644]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:37 compute-0 sudo[152837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zftezvfbxfpibhbkrijybksdwviioomm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948917.4184847-1061-76674262916847/AnsiballZ_systemd.py'
Nov 24 01:48:37 compute-0 sudo[152837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:38 compute-0 python3.9[152839]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:48:38 compute-0 systemd[1]: Reloading.
Nov 24 01:48:38 compute-0 systemd-rc-local-generator[152866]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:48:38 compute-0 systemd-sysv-generator[152870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:48:38 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 24 01:48:38 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 24 01:48:38 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 24 01:48:38 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 24 01:48:38 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 24 01:48:38 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 24 01:48:38 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 01:48:38 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 24 01:48:38 compute-0 sudo[152837]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:38 compute-0 sudo[153055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvtffflkubrmgionjdlbnlqacuhozdbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948918.5701904-1061-150653775607952/AnsiballZ_systemd.py'
Nov 24 01:48:38 compute-0 sudo[153055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:39 compute-0 python3.9[153057]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:48:39 compute-0 systemd[1]: Reloading.
Nov 24 01:48:39 compute-0 systemd-sysv-generator[153084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:48:39 compute-0 systemd-rc-local-generator[153078]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:48:39 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 24 01:48:39 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 24 01:48:39 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 24 01:48:39 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 24 01:48:39 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 24 01:48:39 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 24 01:48:39 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 24 01:48:39 compute-0 sudo[153055]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:39 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 24 01:48:39 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 24 01:48:40 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 24 01:48:40 compute-0 sshd-session[152902]: Received disconnect from 154.90.59.75 port 44596:11: Bye Bye [preauth]
Nov 24 01:48:40 compute-0 sshd-session[152902]: Disconnected from authenticating user root 154.90.59.75 port 44596 [preauth]
Nov 24 01:48:40 compute-0 sudo[153274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enzrtcphduvrcbeyberfpvbcrltgxwcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948919.7937477-1061-227962972733112/AnsiballZ_systemd.py'
Nov 24 01:48:40 compute-0 sudo[153274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:40 compute-0 python3.9[153276]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:48:40 compute-0 systemd[1]: Reloading.
Nov 24 01:48:40 compute-0 systemd-rc-local-generator[153309]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:48:40 compute-0 systemd-sysv-generator[153312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:48:40 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 24 01:48:40 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 24 01:48:40 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 24 01:48:40 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 24 01:48:40 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 24 01:48:40 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 24 01:48:40 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 24 01:48:40 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 24 01:48:40 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 24 01:48:40 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 24 01:48:40 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 01:48:40 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 24 01:48:40 compute-0 sudo[153274]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:41 compute-0 setroubleshoot[153093]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 15387ae7-7bb0-40fb-8964-784c751a2d93
Nov 24 01:48:41 compute-0 setroubleshoot[153093]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 24 01:48:41 compute-0 sudo[153493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeglumjmuaglxywtoztxszxjukgdamnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948921.0042343-1061-98460278601472/AnsiballZ_systemd.py'
Nov 24 01:48:41 compute-0 sudo[153493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:41 compute-0 python3.9[153495]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:48:41 compute-0 systemd[1]: Reloading.
Nov 24 01:48:41 compute-0 systemd-rc-local-generator[153521]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:48:41 compute-0 systemd-sysv-generator[153525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:48:41 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 24 01:48:41 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 24 01:48:41 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 24 01:48:41 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 24 01:48:41 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 24 01:48:41 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 24 01:48:41 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 24 01:48:41 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 24 01:48:41 compute-0 sudo[153493]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:42 compute-0 sudo[153705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dixgjauqqjwfqdyfsopqwaymafqsmzrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948922.2077904-1098-32164570042990/AnsiballZ_file.py'
Nov 24 01:48:42 compute-0 sudo[153705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:42 compute-0 python3.9[153707]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:42 compute-0 sudo[153705]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:43 compute-0 sudo[153857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fclkirdiglaswcabmorkidqktkwcojuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948922.8930237-1106-112418269887556/AnsiballZ_find.py'
Nov 24 01:48:43 compute-0 sudo[153857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:43 compute-0 python3.9[153859]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:48:43 compute-0 sudo[153857]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:44 compute-0 sudo[154009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tofyhgpaeqmjepdgieqhrunphnyocqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948923.8464274-1120-98465152947533/AnsiballZ_stat.py'
Nov 24 01:48:44 compute-0 sudo[154009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:44 compute-0 python3.9[154011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:44 compute-0 sudo[154009]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:44 compute-0 sudo[154132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzikjnetiqitjfytuaqadjpvhufbbzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948923.8464274-1120-98465152947533/AnsiballZ_copy.py'
Nov 24 01:48:44 compute-0 sudo[154132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:44 compute-0 python3.9[154134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948923.8464274-1120-98465152947533/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:44 compute-0 sudo[154132]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:45 compute-0 sudo[154284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbvukhrkhwrpfmwwzrbmlgqchyvvpqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948925.2528608-1136-232367702515442/AnsiballZ_file.py'
Nov 24 01:48:45 compute-0 sudo[154284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:45 compute-0 python3.9[154286]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:45 compute-0 sudo[154284]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:46 compute-0 sudo[154436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmytetsgsudafhqarxzllktudbgvgofc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948926.0626452-1144-142995350797017/AnsiballZ_stat.py'
Nov 24 01:48:46 compute-0 sudo[154436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:46 compute-0 python3.9[154438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:46 compute-0 sudo[154436]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:46 compute-0 sudo[154514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxlouckseyysitfvxmfggrpesbwpluju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948926.0626452-1144-142995350797017/AnsiballZ_file.py'
Nov 24 01:48:46 compute-0 sudo[154514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:46 compute-0 python3.9[154516]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:47 compute-0 sudo[154514]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:47 compute-0 sudo[154666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewpichhzvzemlndpndogbhptcysdwtsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948927.1951702-1156-135232314457201/AnsiballZ_stat.py'
Nov 24 01:48:47 compute-0 sudo[154666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:47 compute-0 python3.9[154668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:47 compute-0 sudo[154666]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:47 compute-0 sudo[154744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giibratrycxtpxbftewhbmfwohssoatg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948927.1951702-1156-135232314457201/AnsiballZ_file.py'
Nov 24 01:48:47 compute-0 sudo[154744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:48 compute-0 python3.9[154746]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.cu0qn6ox recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:48 compute-0 sudo[154744]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:48:48.405 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:48:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:48:48.407 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:48:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:48:48.407 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:48:48 compute-0 sudo[154896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbipiueqabgrijnicgrvkkhiazuotaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948928.2953424-1168-67874834976721/AnsiballZ_stat.py'
Nov 24 01:48:48 compute-0 sudo[154896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:48 compute-0 python3.9[154898]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:48 compute-0 sudo[154896]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:48 compute-0 sudo[154974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwppajchjnewtaqhmssabautzdshetuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948928.2953424-1168-67874834976721/AnsiballZ_file.py'
Nov 24 01:48:48 compute-0 sudo[154974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:49 compute-0 python3.9[154976]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:49 compute-0 sudo[154974]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:49 compute-0 sudo[155126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owzcnvrxepmsiiicesetfrdccltqtcth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948929.5197012-1181-77835716226652/AnsiballZ_command.py'
Nov 24 01:48:49 compute-0 sudo[155126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:50 compute-0 python3.9[155128]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:48:50 compute-0 sudo[155126]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:50 compute-0 sudo[155279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynieqkbibrnomgytlnxrizmryvxrzowl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763948930.269173-1189-20041798458199/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 01:48:50 compute-0 sudo[155279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:50 compute-0 python3[155281]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 01:48:50 compute-0 sudo[155279]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:51 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 24 01:48:51 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.021s CPU time.
Nov 24 01:48:51 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 24 01:48:51 compute-0 podman[155306]: 2025-11-24 01:48:51.21455738 +0000 UTC m=+0.077087529 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:48:51 compute-0 sudo[155449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhkyfnvgylajxapdiegfzqstdhevwnej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948931.1940825-1197-122275531573892/AnsiballZ_stat.py'
Nov 24 01:48:51 compute-0 sudo[155449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:51 compute-0 python3.9[155451]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:51 compute-0 sudo[155449]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:52 compute-0 sudo[155527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blnnzfgldpwibbqhfljihxtdmqyvoymz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948931.1940825-1197-122275531573892/AnsiballZ_file.py'
Nov 24 01:48:52 compute-0 sudo[155527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:52 compute-0 python3.9[155529]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:52 compute-0 sudo[155527]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:52 compute-0 sudo[155679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxaqowjwnhjzcconewmcoidgvkjgotva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948932.4806845-1209-50859205250046/AnsiballZ_stat.py'
Nov 24 01:48:52 compute-0 sudo[155679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:53 compute-0 python3.9[155681]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:53 compute-0 sudo[155679]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:53 compute-0 sudo[155757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ingzvnfjueghrzlvcdjysfdfhfpyfcyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948932.4806845-1209-50859205250046/AnsiballZ_file.py'
Nov 24 01:48:53 compute-0 sudo[155757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:53 compute-0 python3.9[155759]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:53 compute-0 sudo[155757]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:53 compute-0 sudo[155909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tajreimyorrrvbfstwttxechfpxnumhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948933.6725745-1221-263948401735059/AnsiballZ_stat.py'
Nov 24 01:48:53 compute-0 sudo[155909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:54 compute-0 python3.9[155911]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:54 compute-0 sudo[155909]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:54 compute-0 sudo[155987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faxayfpokschckkcuqlewqrlmilxoyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948933.6725745-1221-263948401735059/AnsiballZ_file.py'
Nov 24 01:48:54 compute-0 sudo[155987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:54 compute-0 python3.9[155989]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:54 compute-0 sudo[155987]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:55 compute-0 sudo[156139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irxugljwzacudcjelhuiuxkgyttfdexe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948934.8857749-1233-251004852748618/AnsiballZ_stat.py'
Nov 24 01:48:55 compute-0 sudo[156139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:55 compute-0 python3.9[156141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:55 compute-0 sudo[156139]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:55 compute-0 sudo[156217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yatwcbwxsbjjwwuuomcgtoeipjqmfmki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948934.8857749-1233-251004852748618/AnsiballZ_file.py'
Nov 24 01:48:55 compute-0 sudo[156217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:55 compute-0 python3.9[156219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:55 compute-0 sudo[156217]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:56 compute-0 sudo[156369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lamfvnyrupnbmjqkccuxbaxzyjtkmcqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948936.0130785-1245-84927162252689/AnsiballZ_stat.py'
Nov 24 01:48:56 compute-0 sudo[156369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:56 compute-0 python3.9[156371]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:48:56 compute-0 sudo[156369]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:56 compute-0 sudo[156511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngfakudzvdzqsohvsjmdrsbqmitdqtiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948936.0130785-1245-84927162252689/AnsiballZ_copy.py'
Nov 24 01:48:56 compute-0 sudo[156511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:56 compute-0 podman[156468]: 2025-11-24 01:48:56.995721395 +0000 UTC m=+0.092590790 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:48:57 compute-0 python3.9[156516]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763948936.0130785-1245-84927162252689/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:57 compute-0 sudo[156511]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:57 compute-0 sudo[156672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsgpmfybmkbgfimszduedfqkpskfafds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948937.3490605-1260-10132114423658/AnsiballZ_file.py'
Nov 24 01:48:57 compute-0 sudo[156672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:57 compute-0 python3.9[156674]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:57 compute-0 sudo[156672]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:58 compute-0 sudo[156824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chthyausenvczcntktatchnnljbfcctn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948938.1441512-1268-207729839800271/AnsiballZ_command.py'
Nov 24 01:48:58 compute-0 sudo[156824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:58 compute-0 python3.9[156826]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:48:58 compute-0 sudo[156824]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:59 compute-0 sudo[156979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrnhecthgyaaxzkghrnzmvjdjfxmmbvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948938.8557143-1276-204045427004838/AnsiballZ_blockinfile.py'
Nov 24 01:48:59 compute-0 sudo[156979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:48:59 compute-0 python3.9[156981]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:48:59 compute-0 sudo[156979]: pam_unix(sudo:session): session closed for user root
Nov 24 01:48:59 compute-0 sudo[157131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsmrgvfjmauulbranixeamkkafvfvfap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948939.7232492-1285-258093131870334/AnsiballZ_command.py'
Nov 24 01:48:59 compute-0 sudo[157131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:00 compute-0 python3.9[157133]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:49:00 compute-0 sudo[157131]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:00 compute-0 sudo[157284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwclgqsytkklmeaggpxamkgnoeflpcrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948940.3484857-1293-155005133372459/AnsiballZ_stat.py'
Nov 24 01:49:00 compute-0 sudo[157284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:00 compute-0 python3.9[157286]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:00 compute-0 sudo[157284]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:01 compute-0 sudo[157438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaikscfyxndxuxhruqevgaujahdbebtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948941.0088782-1301-35202987853981/AnsiballZ_command.py'
Nov 24 01:49:01 compute-0 sudo[157438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:01 compute-0 python3.9[157440]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:49:01 compute-0 sudo[157438]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:01 compute-0 sudo[157593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzmpfyplzbuuhjjoczxnlsrlanqtwcow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948941.6589923-1309-240955809524968/AnsiballZ_file.py'
Nov 24 01:49:01 compute-0 sudo[157593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:02 compute-0 python3.9[157595]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:02 compute-0 sudo[157593]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:02 compute-0 sudo[157745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzsiqiwkedclaarkxnzakuxbybjkocok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948942.3136506-1317-233797550144841/AnsiballZ_stat.py'
Nov 24 01:49:02 compute-0 sudo[157745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:02 compute-0 python3.9[157747]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:02 compute-0 sudo[157745]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:03 compute-0 sudo[157868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugobvqfiryciqmbrlvbokrpkqjwdvwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948942.3136506-1317-233797550144841/AnsiballZ_copy.py'
Nov 24 01:49:03 compute-0 sudo[157868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:03 compute-0 python3.9[157870]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948942.3136506-1317-233797550144841/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:03 compute-0 sudo[157868]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:03 compute-0 sudo[158020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whsjqbppmdsaazlgbjkoiqigznrfurzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948943.5432136-1332-18409596011683/AnsiballZ_stat.py'
Nov 24 01:49:03 compute-0 sudo[158020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:04 compute-0 python3.9[158022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:04 compute-0 sudo[158020]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:04 compute-0 sudo[158143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxvfzhpilmrqnpbdkcvzfidnjupnfrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948943.5432136-1332-18409596011683/AnsiballZ_copy.py'
Nov 24 01:49:04 compute-0 sudo[158143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:04 compute-0 python3.9[158145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948943.5432136-1332-18409596011683/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:04 compute-0 sudo[158143]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:05 compute-0 sudo[158295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmghywzmdyisdbpadhrmsoykbsqecats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948944.80206-1347-75098048404436/AnsiballZ_stat.py'
Nov 24 01:49:05 compute-0 sudo[158295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:05 compute-0 python3.9[158297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:05 compute-0 sudo[158295]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:05 compute-0 sudo[158418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhflklqyvtxhjxmaddvawpcmcdnalcye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948944.80206-1347-75098048404436/AnsiballZ_copy.py'
Nov 24 01:49:05 compute-0 sudo[158418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:05 compute-0 python3.9[158420]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948944.80206-1347-75098048404436/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:05 compute-0 sudo[158418]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:06 compute-0 sudo[158570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svdsvyehkdpfpmlmkpcqpkibbnjbquom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948946.0771828-1362-39695347017292/AnsiballZ_systemd.py'
Nov 24 01:49:06 compute-0 sudo[158570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:06 compute-0 python3.9[158572]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:49:06 compute-0 systemd[1]: Reloading.
Nov 24 01:49:06 compute-0 systemd-rc-local-generator[158601]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:49:06 compute-0 systemd-sysv-generator[158606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:49:06 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 24 01:49:07 compute-0 sudo[158570]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:07 compute-0 sudo[158761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxlxdjnkzydozpfhtkachehdmrdgkxgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948947.1588082-1370-242119936014899/AnsiballZ_systemd.py'
Nov 24 01:49:07 compute-0 sudo[158761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:07 compute-0 python3.9[158763]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 01:49:07 compute-0 systemd[1]: Reloading.
Nov 24 01:49:07 compute-0 systemd-rc-local-generator[158789]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:49:07 compute-0 systemd-sysv-generator[158792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:49:08 compute-0 systemd[1]: Reloading.
Nov 24 01:49:08 compute-0 systemd-sysv-generator[158831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:49:08 compute-0 systemd-rc-local-generator[158827]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:49:08 compute-0 sudo[158761]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:08 compute-0 sshd-session[104382]: Connection closed by 192.168.122.30 port 42278
Nov 24 01:49:08 compute-0 sshd-session[104379]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:49:08 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 24 01:49:08 compute-0 systemd[1]: session-22.scope: Consumed 3min 34.478s CPU time.
Nov 24 01:49:08 compute-0 systemd-logind[791]: Session 22 logged out. Waiting for processes to exit.
Nov 24 01:49:08 compute-0 systemd-logind[791]: Removed session 22.
Nov 24 01:49:14 compute-0 sshd-session[158861]: Accepted publickey for zuul from 192.168.122.30 port 37096 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:49:14 compute-0 systemd-logind[791]: New session 23 of user zuul.
Nov 24 01:49:14 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 24 01:49:14 compute-0 sshd-session[158861]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:49:14 compute-0 sshd-session[158859]: Invalid user tuan from 46.188.119.26 port 33808
Nov 24 01:49:14 compute-0 sshd-session[158859]: Received disconnect from 46.188.119.26 port 33808:11: Bye Bye [preauth]
Nov 24 01:49:14 compute-0 sshd-session[158859]: Disconnected from invalid user tuan 46.188.119.26 port 33808 [preauth]
Nov 24 01:49:15 compute-0 python3.9[159014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:49:17 compute-0 python3.9[159168]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:49:17 compute-0 network[159185]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:49:17 compute-0 network[159186]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:49:17 compute-0 network[159187]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:49:21 compute-0 podman[159383]: 2025-11-24 01:49:21.845735266 +0000 UTC m=+0.089546109 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 01:49:22 compute-0 sudo[159476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvyvvcesoqkoobfdubzychvjplckguxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948961.121137-47-137128202170497/AnsiballZ_setup.py'
Nov 24 01:49:22 compute-0 sudo[159476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:22 compute-0 python3.9[159478]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 01:49:22 compute-0 sudo[159476]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:23 compute-0 sudo[159560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iectydqtxvtejvsursjrwbciquvftxnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948961.121137-47-137128202170497/AnsiballZ_dnf.py'
Nov 24 01:49:23 compute-0 sudo[159560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:23 compute-0 python3.9[159562]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:49:27 compute-0 podman[159564]: 2025-11-24 01:49:27.936102265 +0000 UTC m=+0.165452907 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 24 01:49:28 compute-0 sudo[159560]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:28 compute-0 sudo[159741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmzmusndgmvoebdaikuaghvvmuzkqmec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948968.5470746-59-70439244310338/AnsiballZ_stat.py'
Nov 24 01:49:28 compute-0 sudo[159741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:29 compute-0 python3.9[159743]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:29 compute-0 sudo[159741]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:29 compute-0 sudo[159893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsydrpryimhfztgqzgtnocuffexcwiis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948969.4007516-69-232903549860681/AnsiballZ_command.py'
Nov 24 01:49:29 compute-0 sudo[159893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:30 compute-0 python3.9[159895]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:49:30 compute-0 sudo[159893]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:30 compute-0 sudo[160046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrxowuklnpjruououivkqgpskefksecd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948970.3601892-79-230172942951457/AnsiballZ_stat.py'
Nov 24 01:49:30 compute-0 sudo[160046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:30 compute-0 python3.9[160048]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:30 compute-0 sudo[160046]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:31 compute-0 sudo[160198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivilciyckzsgvqguqwglfyrhjpkdhhpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948971.0365615-87-218875126695638/AnsiballZ_command.py'
Nov 24 01:49:31 compute-0 sudo[160198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:31 compute-0 python3.9[160200]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:49:31 compute-0 sudo[160198]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:31 compute-0 sudo[160351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpbkclbnactjtnpabowvosvikivmqcov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948971.673785-95-225562870847270/AnsiballZ_stat.py'
Nov 24 01:49:31 compute-0 sudo[160351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:32 compute-0 python3.9[160353]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:32 compute-0 sudo[160351]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:32 compute-0 sudo[160474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awxstsssevxarsbddiqtlhkhmhupnxuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948971.673785-95-225562870847270/AnsiballZ_copy.py'
Nov 24 01:49:32 compute-0 sudo[160474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:32 compute-0 python3.9[160476]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948971.673785-95-225562870847270/.source.iscsi _original_basename=.38_w1asq follow=False checksum=b25a69345dfa17009f1eb3eac19834b51ef61721 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:32 compute-0 sudo[160474]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:33 compute-0 sudo[160626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajyeqvnkukbfwsjhwsqtomhtcbkgstx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948973.0220747-110-241575338715413/AnsiballZ_file.py'
Nov 24 01:49:33 compute-0 sudo[160626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:33 compute-0 python3.9[160628]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:33 compute-0 sudo[160626]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:34 compute-0 sudo[160778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnzbbzvnhgtbwsiqcozxfwfdeyusuiwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948973.905816-118-171261905871395/AnsiballZ_lineinfile.py'
Nov 24 01:49:34 compute-0 sudo[160778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:34 compute-0 python3.9[160780]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:34 compute-0 sudo[160778]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:34 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:49:34 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:49:35 compute-0 sudo[160931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbfcbquujirxwogkrnxmkvfddwpsqngp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948974.7755601-127-57760440041927/AnsiballZ_systemd_service.py'
Nov 24 01:49:35 compute-0 sudo[160931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:35 compute-0 python3.9[160933]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:49:35 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 24 01:49:35 compute-0 sudo[160931]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:36 compute-0 sudo[161087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgummtnohzokppbowhtnpzuvibrvawty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948976.027476-135-268822528332965/AnsiballZ_systemd_service.py'
Nov 24 01:49:36 compute-0 sudo[161087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:36 compute-0 python3.9[161089]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:49:36 compute-0 systemd[1]: Reloading.
Nov 24 01:49:36 compute-0 systemd-sysv-generator[161117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:49:36 compute-0 systemd-rc-local-generator[161114]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:49:37 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 01:49:37 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 24 01:49:37 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 24 01:49:37 compute-0 systemd[1]: Started Open-iSCSI.
Nov 24 01:49:37 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 24 01:49:37 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 24 01:49:37 compute-0 sudo[161087]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:37 compute-0 sudo[161286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aepccyaveoanobuzglmnmkmuxijmljpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948977.5753458-146-94644984305795/AnsiballZ_service_facts.py'
Nov 24 01:49:37 compute-0 sudo[161286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:38 compute-0 python3.9[161288]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:49:38 compute-0 network[161305]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:49:38 compute-0 network[161306]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:49:38 compute-0 network[161307]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:49:42 compute-0 sudo[161286]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:43 compute-0 sudo[161576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arnfaikmytxeimagqvsbeibngywmkses ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948982.8965042-156-47976055319100/AnsiballZ_file.py'
Nov 24 01:49:43 compute-0 sudo[161576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:43 compute-0 python3.9[161578]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 01:49:43 compute-0 sudo[161576]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:44 compute-0 sudo[161728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpoarcjqllsiuwefwefngnmzkwhzzir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948983.6958466-164-126768509232264/AnsiballZ_modprobe.py'
Nov 24 01:49:44 compute-0 sudo[161728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:44 compute-0 python3.9[161730]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 24 01:49:44 compute-0 sudo[161728]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:44 compute-0 sudo[161884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khfpvjcwodfmxkujeoicxbusjacnghzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948984.5375621-172-137088747409593/AnsiballZ_stat.py'
Nov 24 01:49:44 compute-0 sudo[161884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:44 compute-0 python3.9[161886]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:45 compute-0 sudo[161884]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:45 compute-0 sudo[162007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deozrukhctykmupzitqdywghcbdrubng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948984.5375621-172-137088747409593/AnsiballZ_copy.py'
Nov 24 01:49:45 compute-0 sudo[162007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:45 compute-0 python3.9[162009]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948984.5375621-172-137088747409593/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:45 compute-0 sudo[162007]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:46 compute-0 sudo[162159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yintcdgxkjgygyzvgrticqvrhpjextjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948985.8704429-188-276965382606129/AnsiballZ_lineinfile.py'
Nov 24 01:49:46 compute-0 sudo[162159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:46 compute-0 python3.9[162161]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:46 compute-0 sudo[162159]: pam_unix(sudo:session): session closed for user root
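[annotation] The sequence above loads dm-multipath immediately via modprobe (with persistence disabled in the module task itself) and then persists it two ways: a drop-in under /etc/modules-load.d (picked up by systemd-modules-load.service, which is restarted just below) and an idempotent line in /etc/modules. A sketch of that persistence step under stated assumptions (a temp directory stands in for /etc):

```python
import pathlib, tempfile

# Stand-in for /etc so the sketch is side-effect free.
root = pathlib.Path(tempfile.mkdtemp())

# Drop-in read by systemd-modules-load.service at boot
# (real path: /etc/modules-load.d/dm-multipath.conf, mode 0644).
dropin = root / "modules-load.d" / "dm-multipath.conf"
dropin.parent.mkdir(parents=True)
dropin.write_text("dm-multipath\n")
dropin.chmod(0o644)

# lineinfile-style idempotent append to an /etc/modules analogue.
modules = root / "modules"
existing = modules.read_text().splitlines() if modules.exists() else []
if "dm-multipath" not in existing:
    existing.append("dm-multipath")
modules.write_text("\n".join(existing) + "\n")
```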
Nov 24 01:49:47 compute-0 sudo[162311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdumabpsuutlvreiabysguovcfjbrmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948986.6522639-196-258355070664511/AnsiballZ_systemd.py'
Nov 24 01:49:47 compute-0 sudo[162311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:47 compute-0 python3.9[162313]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:49:47 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 01:49:47 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 24 01:49:47 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 24 01:49:47 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 01:49:47 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 01:49:47 compute-0 sudo[162311]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:48 compute-0 sudo[162467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyqmocseqrbldfzthpfgrfngztkalemu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948987.936084-204-157126751103801/AnsiballZ_file.py'
Nov 24 01:49:48 compute-0 sudo[162467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:49:48.406 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:49:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:49:48.407 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:49:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:49:48.407 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:49:48 compute-0 python3.9[162469]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:49:48 compute-0 sudo[162467]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:49 compute-0 sudo[162619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bskffeindyxleenevclvaqjlusxxakgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948988.7814953-213-72079085597484/AnsiballZ_stat.py'
Nov 24 01:49:49 compute-0 sudo[162619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:49 compute-0 python3.9[162621]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:49 compute-0 sudo[162619]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:49 compute-0 sudo[162771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyngoykyuhgoyhzfrwsdgzewunspqqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948989.5646386-222-27244721890818/AnsiballZ_stat.py'
Nov 24 01:49:49 compute-0 sudo[162771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:50 compute-0 python3.9[162773]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:50 compute-0 sudo[162771]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:50 compute-0 sudo[162923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydbjsxosbreqebbuxehddvuiwhtugqwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948990.216792-230-181612520476526/AnsiballZ_stat.py'
Nov 24 01:49:50 compute-0 sudo[162923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:50 compute-0 python3.9[162925]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:49:50 compute-0 sudo[162923]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:51 compute-0 sudo[163046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbdlgskmocvaebarwkmpycajhwilpzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948990.216792-230-181612520476526/AnsiballZ_copy.py'
Nov 24 01:49:51 compute-0 sudo[163046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:51 compute-0 python3.9[163048]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763948990.216792-230-181612520476526/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:51 compute-0 sudo[163046]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:51 compute-0 sudo[163198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttzxhspufafdrnotrxgyppomnuorigyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948991.5122154-245-236843925030260/AnsiballZ_command.py'
Nov 24 01:49:51 compute-0 sudo[163198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:52 compute-0 python3.9[163200]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:49:52 compute-0 sudo[163198]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:52 compute-0 sudo[163360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxwtwvoxwrwmvsxzurecbpsgzgkkwwbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948992.2159922-253-169423643904916/AnsiballZ_lineinfile.py'
Nov 24 01:49:52 compute-0 sudo[163360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:52 compute-0 podman[163325]: 2025-11-24 01:49:52.590558148 +0000 UTC m=+0.081492047 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 01:49:52 compute-0 python3.9[163369]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:52 compute-0 sudo[163360]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:53 compute-0 sudo[163523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmnljwhpbcxjbxeglursrjujfxdrsca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948992.974086-261-184882444725883/AnsiballZ_replace.py'
Nov 24 01:49:53 compute-0 sudo[163523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:53 compute-0 python3.9[163525]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:53 compute-0 sudo[163523]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:54 compute-0 sudo[163675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umggrijpuxxwoqfcdmvtsoiqrlywmmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948993.8843446-269-55624179073150/AnsiballZ_replace.py'
Nov 24 01:49:54 compute-0 sudo[163675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:54 compute-0 python3.9[163677]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:54 compute-0 sudo[163675]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:54 compute-0 sudo[163827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkxcdkilbhpwrsxwpyonwaktoxyfgcvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948994.651304-278-272065442534330/AnsiballZ_lineinfile.py'
Nov 24 01:49:54 compute-0 sudo[163827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:55 compute-0 python3.9[163829]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:55 compute-0 sudo[163827]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:55 compute-0 sudo[163979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsscopgsmazshwyxdxodxzaljzsjirsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948995.3706512-278-214756249123159/AnsiballZ_lineinfile.py'
Nov 24 01:49:55 compute-0 sudo[163979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:55 compute-0 python3.9[163981]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:55 compute-0 sudo[163979]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:56 compute-0 sudo[164131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opejjacwfsbsauqynnujhgiahdwzbite ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948995.9859254-278-43306815970396/AnsiballZ_lineinfile.py'
Nov 24 01:49:56 compute-0 sudo[164131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:56 compute-0 python3.9[164133]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:56 compute-0 sudo[164131]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:56 compute-0 sudo[164283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtkcttsadzkoczafecwmfuujhkllkpem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948996.6060038-278-52893732771698/AnsiballZ_lineinfile.py'
Nov 24 01:49:56 compute-0 sudo[164283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:57 compute-0 python3.9[164285]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:57 compute-0 sudo[164283]: pam_unix(sudo:session): session closed for user root
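[annotation] The run above edits /etc/multipath.conf in stages: check for a `blacklist {` section, ensure one exists and is closed, strip a catch-all `devnode ".*"` entry, then pin four `defaults` settings (find_multipaths, recheck_wwid, skip_kpartx, user_friendly_names) with insertafter=^defaults. A Python sketch of that edit sequence; the starting file content is illustrative, not taken from the host:

```python
import re

# Illustrative stock multipath.conf with a catch-all blacklist.
conf = """defaults {
        user_friendly_names yes
}
blacklist {
        devnode ".*"
}
"""

# lineinfile 'blacklist {': append a section only if none exists.
if not re.search(r'^blacklist\s*{', conf, re.M):
    conf += "blacklist {\n}\n"

# replace: drop the catch-all devnode entry from the blacklist.
conf = re.sub(r'^blacklist\s*{\n\s+devnode "\.\*"', 'blacklist {',
              conf, flags=re.M)

# Four lineinfile tasks: replace an existing key in place, else
# insert after the first '^defaults' line (firstmatch semantics).
settings = [
    ("find_multipaths", "yes"),
    ("recheck_wwid", "yes"),
    ("skip_kpartx", "yes"),
    ("user_friendly_names", "no"),
]
lines = conf.splitlines()
for key, val in settings:
    pat = re.compile(r"^\s+" + key)
    for i, ln in enumerate(lines):
        if pat.match(ln):
            lines[i] = f"        {key} {val}"
            break
    else:
        for i, ln in enumerate(lines):
            if ln.startswith("defaults"):
                lines.insert(i + 1, f"        {key} {val}")
                break
conf = "\n".join(lines) + "\n"
print(conf)
```

Note the inserted keys land in reverse task order (each insert goes directly after `defaults {`), which is harmless since ordering inside the section does not matter to multipathd.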
Nov 24 01:49:57 compute-0 sudo[164435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nntgpemeneiqaweciqlpecrolryvuygz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948997.3444245-307-84397660721137/AnsiballZ_stat.py'
Nov 24 01:49:57 compute-0 sudo[164435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:57 compute-0 python3.9[164437]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:49:57 compute-0 sudo[164435]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:58 compute-0 sudo[164600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombobtuvcecktwxhbryuuydivqbvqgpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948998.0700321-315-48988610514178/AnsiballZ_file.py'
Nov 24 01:49:58 compute-0 sudo[164600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:58 compute-0 podman[164563]: 2025-11-24 01:49:58.470308119 +0000 UTC m=+0.135520835 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 01:49:58 compute-0 python3.9[164609]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:49:58 compute-0 sudo[164600]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:59 compute-0 sudo[164765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehbscsulsnohrnhtymhicuvdwuzjbzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948998.861879-324-174546280465238/AnsiballZ_file.py'
Nov 24 01:49:59 compute-0 sudo[164765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:49:59 compute-0 python3.9[164767]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:49:59 compute-0 sudo[164765]: pam_unix(sudo:session): session closed for user root
Nov 24 01:49:59 compute-0 sudo[164917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjbuzbkujsgprczacnjxlnpmbhgvbtzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948999.6191-332-279750412295450/AnsiballZ_stat.py'
Nov 24 01:49:59 compute-0 sudo[164917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:00 compute-0 python3.9[164919]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:00 compute-0 sudo[164917]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:00 compute-0 sudo[164995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbqtxtwibxnjoulovpsricoczjyrraoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763948999.6191-332-279750412295450/AnsiballZ_file.py'
Nov 24 01:50:00 compute-0 sudo[164995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:00 compute-0 python3.9[164997]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:50:00 compute-0 sudo[164995]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:01 compute-0 sudo[165147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkqcfpflfumyejprbqyhgmvorwvpzxjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949000.7687213-332-116517650598327/AnsiballZ_stat.py'
Nov 24 01:50:01 compute-0 sudo[165147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:01 compute-0 python3.9[165149]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:01 compute-0 sudo[165147]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:01 compute-0 sudo[165225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxcpadchysxnthnvsuybzkbcthutimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949000.7687213-332-116517650598327/AnsiballZ_file.py'
Nov 24 01:50:01 compute-0 sudo[165225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:01 compute-0 python3.9[165227]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:50:01 compute-0 sudo[165225]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:02 compute-0 sudo[165377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slbegfpvfugfuhhfhdcryeqqfeikgzzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949001.9368355-355-63159536969734/AnsiballZ_file.py'
Nov 24 01:50:02 compute-0 sudo[165377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:02 compute-0 python3.9[165379]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:02 compute-0 sudo[165377]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:02 compute-0 sudo[165529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isayacdhgnfkplpxrdjqamnovuhvdomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949002.6143694-363-222773860531707/AnsiballZ_stat.py'
Nov 24 01:50:02 compute-0 sudo[165529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:03 compute-0 python3.9[165531]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:03 compute-0 sudo[165529]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:03 compute-0 sudo[165607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnuzxzofljqaibnoshkcggizoysiekj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949002.6143694-363-222773860531707/AnsiballZ_file.py'
Nov 24 01:50:03 compute-0 sudo[165607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:03 compute-0 python3.9[165609]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:03 compute-0 sudo[165607]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:04 compute-0 sudo[165759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzkidyriwwfwyfgjxalcorslinrrcnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949003.7867866-375-262322494987070/AnsiballZ_stat.py'
Nov 24 01:50:04 compute-0 sudo[165759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:04 compute-0 python3.9[165761]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:04 compute-0 sudo[165759]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:04 compute-0 sudo[165837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdwaebscckusnxgbuulnvtxvwlqqmdrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949003.7867866-375-262322494987070/AnsiballZ_file.py'
Nov 24 01:50:04 compute-0 sudo[165837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:04 compute-0 python3.9[165839]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:04 compute-0 sudo[165837]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:05 compute-0 sudo[165989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whtrffjozwczwytmsixmjeahaftnrqbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949004.844589-387-160875185642514/AnsiballZ_systemd.py'
Nov 24 01:50:05 compute-0 sudo[165989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:05 compute-0 python3.9[165991]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:05 compute-0 systemd[1]: Reloading.
Nov 24 01:50:05 compute-0 systemd-rc-local-generator[166014]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:05 compute-0 systemd-sysv-generator[166018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:05 compute-0 sudo[165989]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:06 compute-0 sudo[166179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smxrztbzrxgugnjnekjdruajdhledikp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949006.0015268-395-89443568644924/AnsiballZ_stat.py'
Nov 24 01:50:06 compute-0 sudo[166179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:06 compute-0 python3.9[166181]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:06 compute-0 sudo[166179]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:06 compute-0 sudo[166257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acivxpkzkznioamejnfxksvmjllmygst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949006.0015268-395-89443568644924/AnsiballZ_file.py'
Nov 24 01:50:06 compute-0 sudo[166257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:06 compute-0 python3.9[166259]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:06 compute-0 sudo[166257]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:07 compute-0 sudo[166409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnqeptwcpobacblwfpyzylrwahyukpub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949007.117352-407-103827391908706/AnsiballZ_stat.py'
Nov 24 01:50:07 compute-0 sudo[166409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:07 compute-0 python3.9[166411]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:07 compute-0 sudo[166409]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:07 compute-0 sudo[166487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxkywmwbswnypnyjadsbhdfmvtdgwzkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949007.117352-407-103827391908706/AnsiballZ_file.py'
Nov 24 01:50:07 compute-0 sudo[166487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:08 compute-0 python3.9[166489]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:08 compute-0 sudo[166487]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:08 compute-0 sudo[166639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-towcnpzhtchdxdimxzemgamflmqndjwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949008.3263059-419-135749110143834/AnsiballZ_systemd.py'
Nov 24 01:50:08 compute-0 sudo[166639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:08 compute-0 python3.9[166641]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:08 compute-0 systemd[1]: Reloading.
Nov 24 01:50:09 compute-0 systemd-rc-local-generator[166670]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:09 compute-0 systemd-sysv-generator[166675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:09 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 01:50:09 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 01:50:09 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 01:50:09 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 01:50:09 compute-0 sudo[166639]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:10 compute-0 sudo[166833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmkavcxuwpdacedpxunzzylgruwbxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949009.6732187-429-200254270046408/AnsiballZ_file.py'
Nov 24 01:50:10 compute-0 sudo[166833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:10 compute-0 python3.9[166835]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:50:10 compute-0 sudo[166833]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:10 compute-0 sudo[166985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnvztefpwualqkrnfbuaczpickmbmnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949010.4529552-437-200405178263005/AnsiballZ_stat.py'
Nov 24 01:50:10 compute-0 sudo[166985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:10 compute-0 python3.9[166987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:11 compute-0 sudo[166985]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:11 compute-0 sudo[167108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peieudowlnrumcrgmmnkawesqnnmtenb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949010.4529552-437-200405178263005/AnsiballZ_copy.py'
Nov 24 01:50:11 compute-0 sudo[167108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:11 compute-0 python3.9[167110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949010.4529552-437-200405178263005/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:50:11 compute-0 sudo[167108]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:12 compute-0 sudo[167260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigvmdsjgvcbeyqdmtmjxtbiiipbluhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949011.9849424-454-47728583311941/AnsiballZ_file.py'
Nov 24 01:50:12 compute-0 sudo[167260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:12 compute-0 python3.9[167262]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:50:12 compute-0 sudo[167260]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:13 compute-0 sudo[167412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tghxwwpffikgbfnmyjvpjedkkumftooy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949012.6676362-462-244595936731331/AnsiballZ_stat.py'
Nov 24 01:50:13 compute-0 sudo[167412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:13 compute-0 python3.9[167414]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:13 compute-0 sudo[167412]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:13 compute-0 sudo[167535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhmjvfuvmkwovsiejsgscdsgsombikdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949012.6676362-462-244595936731331/AnsiballZ_copy.py'
Nov 24 01:50:13 compute-0 sudo[167535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:13 compute-0 python3.9[167537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949012.6676362-462-244595936731331/.source.json _original_basename=.o10am6gh follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:13 compute-0 sudo[167535]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:14 compute-0 sudo[167687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btixtwedzfmbpfpixofhsrhkknvadakq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949013.9483328-477-219101833792325/AnsiballZ_file.py'
Nov 24 01:50:14 compute-0 sudo[167687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:14 compute-0 python3.9[167689]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:14 compute-0 sudo[167687]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:15 compute-0 sudo[167839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhppwgaerkpklzlvcumaojbnajmnyzbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949014.7203953-485-243570073811445/AnsiballZ_stat.py'
Nov 24 01:50:15 compute-0 sudo[167839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:15 compute-0 sudo[167839]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:15 compute-0 sudo[167962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhowrftfljuycncjyzyvujdolrsjqiqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949014.7203953-485-243570073811445/AnsiballZ_copy.py'
Nov 24 01:50:15 compute-0 sudo[167962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:15 compute-0 sudo[167962]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:16 compute-0 sudo[168114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqfcnzrdeuwjcfjvuwsonuvfvesejio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949016.0837235-502-81189537184613/AnsiballZ_container_config_data.py'
Nov 24 01:50:16 compute-0 sudo[168114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:16 compute-0 python3.9[168116]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 24 01:50:16 compute-0 sudo[168114]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:17 compute-0 sudo[168266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwezlglltxnqxvcproqboinoznljxenm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949016.995049-511-40912908580235/AnsiballZ_container_config_hash.py'
Nov 24 01:50:17 compute-0 sudo[168266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:17 compute-0 python3.9[168268]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:50:17 compute-0 sudo[168266]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:18 compute-0 sudo[168418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubhtlvgbyqzuateqvasitsxtenloblj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949017.9391296-520-113734588845480/AnsiballZ_podman_container_info.py'
Nov 24 01:50:18 compute-0 sudo[168418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:18 compute-0 python3.9[168420]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 01:50:18 compute-0 sudo[168418]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:19 compute-0 sudo[168597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agorgtzwztvwjmfnqwkmruxjecwdfztt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949019.1762333-533-79734791641277/AnsiballZ_edpm_container_manage.py'
Nov 24 01:50:19 compute-0 sudo[168597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:19 compute-0 python3[168599]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:50:20 compute-0 podman[168636]: 2025-11-24 01:50:20.141233266 +0000 UTC m=+0.053433760 container create 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 01:50:20 compute-0 podman[168636]: 2025-11-24 01:50:20.115126655 +0000 UTC m=+0.027327199 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 01:50:20 compute-0 python3[168599]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 01:50:20 compute-0 sudo[168597]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:20 compute-0 sudo[168826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mborjpsizritwrghqheyjijadrrtqrbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949020.4699721-541-237330060126295/AnsiballZ_stat.py'
Nov 24 01:50:20 compute-0 sudo[168826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:20 compute-0 python3.9[168828]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:50:20 compute-0 sudo[168826]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:21 compute-0 sudo[168980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcfperhqnlvtjdocvzqlfkloiptmqdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949021.2356176-550-41656707621069/AnsiballZ_file.py'
Nov 24 01:50:21 compute-0 sudo[168980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:21 compute-0 python3.9[168982]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:21 compute-0 sudo[168980]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:21 compute-0 sudo[169056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsddgtxccwjbsfuemzmfsnnucebkgkfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949021.2356176-550-41656707621069/AnsiballZ_stat.py'
Nov 24 01:50:21 compute-0 sudo[169056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:22 compute-0 python3.9[169058]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:50:22 compute-0 sudo[169056]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:22 compute-0 podman[169181]: 2025-11-24 01:50:22.703411953 +0000 UTC m=+0.059383872 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 24 01:50:22 compute-0 sudo[169221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzsvmbharplabimhvkgsfkwiypxuasdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949022.2261178-550-111423913407586/AnsiballZ_copy.py'
Nov 24 01:50:22 compute-0 sudo[169221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:22 compute-0 python3.9[169225]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949022.2261178-550-111423913407586/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:22 compute-0 sudo[169221]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:23 compute-0 sudo[169299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoswmgyxllhlruoaboxnpajcfxkkghgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949022.2261178-550-111423913407586/AnsiballZ_systemd.py'
Nov 24 01:50:23 compute-0 sudo[169299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:23 compute-0 python3.9[169301]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:50:23 compute-0 systemd[1]: Reloading.
Nov 24 01:50:23 compute-0 systemd-rc-local-generator[169329]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:23 compute-0 systemd-sysv-generator[169333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:23 compute-0 sudo[169299]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:24 compute-0 sudo[169410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwkjggyovpfkdvbjqhrdzetklvppherc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949022.2261178-550-111423913407586/AnsiballZ_systemd.py'
Nov 24 01:50:24 compute-0 sudo[169410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:24 compute-0 python3.9[169412]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:24 compute-0 systemd[1]: Reloading.
Nov 24 01:50:24 compute-0 systemd-sysv-generator[169445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:24 compute-0 systemd-rc-local-generator[169441]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:24 compute-0 systemd[1]: Starting multipathd container...
Nov 24 01:50:24 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:50:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d99a505952a87cf05c0f4bfb72dabcd5627981caef0214a6b0f3d469e92b87e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 01:50:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d99a505952a87cf05c0f4bfb72dabcd5627981caef0214a6b0f3d469e92b87e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 01:50:24 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.
Nov 24 01:50:24 compute-0 podman[169452]: 2025-11-24 01:50:24.879777726 +0000 UTC m=+0.125229878 container init 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 01:50:24 compute-0 multipathd[169467]: + sudo -E kolla_set_configs
Nov 24 01:50:24 compute-0 sudo[169473]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 01:50:24 compute-0 sudo[169473]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:50:24 compute-0 sudo[169473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 01:50:24 compute-0 podman[169452]: 2025-11-24 01:50:24.912056206 +0000 UTC m=+0.157508338 container start 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 24 01:50:24 compute-0 podman[169452]: multipathd
Nov 24 01:50:24 compute-0 systemd[1]: Started multipathd container.
Nov 24 01:50:24 compute-0 multipathd[169467]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:50:24 compute-0 multipathd[169467]: INFO:__main__:Validating config file
Nov 24 01:50:24 compute-0 sudo[169473]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:24 compute-0 multipathd[169467]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:50:24 compute-0 multipathd[169467]: INFO:__main__:Writing out command to execute
Nov 24 01:50:24 compute-0 multipathd[169467]: ++ cat /run_command
Nov 24 01:50:24 compute-0 multipathd[169467]: + CMD='/usr/sbin/multipathd -d'
Nov 24 01:50:24 compute-0 multipathd[169467]: + ARGS=
Nov 24 01:50:24 compute-0 multipathd[169467]: + sudo kolla_copy_cacerts
Nov 24 01:50:24 compute-0 sudo[169410]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:24 compute-0 sudo[169491]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 01:50:24 compute-0 sudo[169491]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:50:24 compute-0 sudo[169491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 01:50:24 compute-0 sudo[169491]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:24 compute-0 multipathd[169467]: + [[ ! -n '' ]]
Nov 24 01:50:24 compute-0 multipathd[169467]: + . kolla_extend_start
Nov 24 01:50:24 compute-0 multipathd[169467]: Running command: '/usr/sbin/multipathd -d'
Nov 24 01:50:24 compute-0 multipathd[169467]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 01:50:24 compute-0 multipathd[169467]: + umask 0022
Nov 24 01:50:24 compute-0 multipathd[169467]: + exec /usr/sbin/multipathd -d
Nov 24 01:50:24 compute-0 podman[169474]: 2025-11-24 01:50:24.999439162 +0000 UTC m=+0.073803796 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 01:50:25 compute-0 systemd[1]: 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-6b65b51abf27c105.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:50:25 compute-0 systemd[1]: 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-6b65b51abf27c105.service: Failed with result 'exit-code'.
Nov 24 01:50:25 compute-0 multipathd[169467]: 2491.731256 | --------start up--------
Nov 24 01:50:25 compute-0 multipathd[169467]: 2491.731271 | read /etc/multipath.conf
Nov 24 01:50:25 compute-0 multipathd[169467]: 2491.736380 | path checkers start up
Nov 24 01:50:25 compute-0 python3.9[169653]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:50:26 compute-0 sudo[169805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofuoccxyttyfsigqzrjjqjadjpxkqphl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949025.856317-586-74162265708476/AnsiballZ_command.py'
Nov 24 01:50:26 compute-0 sudo[169805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:26 compute-0 python3.9[169807]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:50:26 compute-0 sudo[169805]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:26 compute-0 sudo[169971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emexmcadslqgridsqndbyuveelmtbugv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949026.5657907-594-194516625932186/AnsiballZ_systemd.py'
Nov 24 01:50:26 compute-0 sudo[169971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:27 compute-0 python3.9[169973]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:50:27 compute-0 systemd[1]: Stopping multipathd container...
Nov 24 01:50:27 compute-0 multipathd[169467]: 2494.097868 | exit (signal)
Nov 24 01:50:27 compute-0 multipathd[169467]: 2494.098444 | --------shut down-------
Nov 24 01:50:27 compute-0 systemd[1]: libpod-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.scope: Deactivated successfully.
Nov 24 01:50:27 compute-0 podman[169978]: 2025-11-24 01:50:27.414955213 +0000 UTC m=+0.079665735 container died 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:50:27 compute-0 systemd[1]: 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-6b65b51abf27c105.timer: Deactivated successfully.
Nov 24 01:50:27 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.
Nov 24 01:50:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-userdata-shm.mount: Deactivated successfully.
Nov 24 01:50:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-9d99a505952a87cf05c0f4bfb72dabcd5627981caef0214a6b0f3d469e92b87e-merged.mount: Deactivated successfully.
Nov 24 01:50:27 compute-0 podman[169978]: 2025-11-24 01:50:27.477369561 +0000 UTC m=+0.142080093 container cleanup 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 24 01:50:27 compute-0 podman[169978]: multipathd
Nov 24 01:50:27 compute-0 podman[170007]: multipathd
Nov 24 01:50:27 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 24 01:50:27 compute-0 systemd[1]: Stopped multipathd container.
Nov 24 01:50:27 compute-0 systemd[1]: Starting multipathd container...
Nov 24 01:50:27 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:50:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d99a505952a87cf05c0f4bfb72dabcd5627981caef0214a6b0f3d469e92b87e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 01:50:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d99a505952a87cf05c0f4bfb72dabcd5627981caef0214a6b0f3d469e92b87e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 01:50:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.
Nov 24 01:50:27 compute-0 podman[170021]: 2025-11-24 01:50:27.745814733 +0000 UTC m=+0.147204571 container init 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 24 01:50:27 compute-0 multipathd[170037]: + sudo -E kolla_set_configs
Nov 24 01:50:27 compute-0 podman[170021]: 2025-11-24 01:50:27.777544397 +0000 UTC m=+0.178934155 container start 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd)
Nov 24 01:50:27 compute-0 podman[170021]: multipathd
Nov 24 01:50:27 compute-0 systemd[1]: Started multipathd container.
Nov 24 01:50:27 compute-0 sudo[170043]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 01:50:27 compute-0 sudo[170043]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:50:27 compute-0 sudo[170043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 01:50:27 compute-0 sudo[169971]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:27 compute-0 podman[170044]: 2025-11-24 01:50:27.839751598 +0000 UTC m=+0.052627916 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 01:50:27 compute-0 systemd[1]: 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-7f699d9407b3e7b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:50:27 compute-0 systemd[1]: 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6-7f699d9407b3e7b0.service: Failed with result 'exit-code'.
Nov 24 01:50:27 compute-0 multipathd[170037]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:50:27 compute-0 multipathd[170037]: INFO:__main__:Validating config file
Nov 24 01:50:27 compute-0 multipathd[170037]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:50:27 compute-0 multipathd[170037]: INFO:__main__:Writing out command to execute
Nov 24 01:50:27 compute-0 sudo[170043]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:27 compute-0 multipathd[170037]: ++ cat /run_command
Nov 24 01:50:27 compute-0 multipathd[170037]: + CMD='/usr/sbin/multipathd -d'
Nov 24 01:50:27 compute-0 multipathd[170037]: + ARGS=
Nov 24 01:50:27 compute-0 multipathd[170037]: + sudo kolla_copy_cacerts
Nov 24 01:50:27 compute-0 sudo[170090]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 01:50:27 compute-0 sudo[170090]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:50:27 compute-0 sudo[170090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 01:50:27 compute-0 sudo[170090]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:27 compute-0 multipathd[170037]: + [[ ! -n '' ]]
Nov 24 01:50:27 compute-0 multipathd[170037]: + . kolla_extend_start
Nov 24 01:50:27 compute-0 multipathd[170037]: Running command: '/usr/sbin/multipathd -d'
Nov 24 01:50:27 compute-0 multipathd[170037]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 01:50:27 compute-0 multipathd[170037]: + umask 0022
Nov 24 01:50:27 compute-0 multipathd[170037]: + exec /usr/sbin/multipathd -d
Nov 24 01:50:27 compute-0 multipathd[170037]: 2494.632674 | --------start up--------
Nov 24 01:50:27 compute-0 multipathd[170037]: 2494.632695 | read /etc/multipath.conf
Nov 24 01:50:27 compute-0 multipathd[170037]: 2494.639264 | path checkers start up
Nov 24 01:50:28 compute-0 sudo[170227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcfcwxkdgwkaixsrvbwcxgtdjlcagkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949028.0039437-602-219844042217261/AnsiballZ_file.py'
Nov 24 01:50:28 compute-0 sudo[170227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:28 compute-0 python3.9[170229]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:28 compute-0 sudo[170227]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:28 compute-0 sshd-session[170156]: Invalid user userroot from 46.188.119.26 port 34136
Nov 24 01:50:28 compute-0 podman[170254]: 2025-11-24 01:50:28.843496658 +0000 UTC m=+0.098091026 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 01:50:28 compute-0 sshd-session[170156]: Received disconnect from 46.188.119.26 port 34136:11: Bye Bye [preauth]
Nov 24 01:50:28 compute-0 sshd-session[170156]: Disconnected from invalid user userroot 46.188.119.26 port 34136 [preauth]
Nov 24 01:50:29 compute-0 sudo[170406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtlkfbvhaqcatldtcwxwqkqvfnnfdvjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949028.893242-614-274028012293098/AnsiballZ_file.py'
Nov 24 01:50:29 compute-0 sudo[170406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:29 compute-0 python3.9[170408]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 01:50:29 compute-0 sudo[170406]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:29 compute-0 sudo[170558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgfhmzjwgqlzpybaocprrvfxcxmuduuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949029.5801332-622-65670256807978/AnsiballZ_modprobe.py'
Nov 24 01:50:29 compute-0 sudo[170558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:30 compute-0 python3.9[170560]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 24 01:50:30 compute-0 kernel: Key type psk registered
Nov 24 01:50:30 compute-0 sudo[170558]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:30 compute-0 sudo[170722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkuxrakedujnsgrsreuxicexbilzenka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949030.3228421-630-178705985482979/AnsiballZ_stat.py'
Nov 24 01:50:30 compute-0 sudo[170722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:30 compute-0 python3.9[170724]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:50:30 compute-0 sudo[170722]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:31 compute-0 sudo[170845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeezxonltjfeuzuwdzwcjjitppgljkzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949030.3228421-630-178705985482979/AnsiballZ_copy.py'
Nov 24 01:50:31 compute-0 sudo[170845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:31 compute-0 python3.9[170847]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949030.3228421-630-178705985482979/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:31 compute-0 sudo[170845]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:31 compute-0 sudo[170997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkwzyfshzxhfufehgnznofnnlvaqnrpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949031.705098-646-259828241463810/AnsiballZ_lineinfile.py'
Nov 24 01:50:31 compute-0 sudo[170997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:32 compute-0 python3.9[170999]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:32 compute-0 sudo[170997]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:32 compute-0 sudo[171149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtooouhnksnuxmxxjwubxxoquasoyhsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949032.3624604-654-100448501839619/AnsiballZ_systemd.py'
Nov 24 01:50:32 compute-0 sudo[171149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:32 compute-0 python3.9[171151]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:50:32 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 01:50:32 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 24 01:50:32 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 24 01:50:33 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 01:50:33 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 01:50:33 compute-0 sudo[171149]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:33 compute-0 sudo[171305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbmevqahnusouhlpcxfeenodgsvwjzye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949033.2357292-662-278711220224116/AnsiballZ_dnf.py'
Nov 24 01:50:33 compute-0 sudo[171305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:33 compute-0 python3.9[171307]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 01:50:36 compute-0 systemd[1]: Reloading.
Nov 24 01:50:36 compute-0 systemd-sysv-generator[171345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:36 compute-0 systemd-rc-local-generator[171342]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:36 compute-0 systemd[1]: Reloading.
Nov 24 01:50:36 compute-0 systemd-rc-local-generator[171372]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:36 compute-0 systemd-sysv-generator[171377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:36 compute-0 systemd-logind[791]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 01:50:36 compute-0 systemd-logind[791]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 01:50:37 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 01:50:37 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 01:50:37 compute-0 systemd[1]: Reloading.
Nov 24 01:50:37 compute-0 systemd-rc-local-generator[171469]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:37 compute-0 systemd-sysv-generator[171473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:37 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 01:50:37 compute-0 sudo[171305]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:38 compute-0 sudo[172757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iudkvftovxcrdbisgxwonysdttxnhcfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949038.0547569-670-48082205443037/AnsiballZ_systemd_service.py'
Nov 24 01:50:38 compute-0 sudo[172757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:38 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 24 01:50:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 01:50:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 01:50:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.613s CPU time.
Nov 24 01:50:38 compute-0 systemd[1]: run-rbe4c2b55c47845a1a9d96cd1d2320761.service: Deactivated successfully.
Nov 24 01:50:38 compute-0 python3.9[172759]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:50:39 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 01:50:39 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 24 01:50:39 compute-0 iscsid[161128]: iscsid shutting down.
Nov 24 01:50:39 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 24 01:50:39 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 24 01:50:39 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 01:50:39 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 24 01:50:39 compute-0 systemd[1]: Started Open-iSCSI.
Nov 24 01:50:39 compute-0 sudo[172757]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:40 compute-0 python3.9[172917]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:50:40 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 24 01:50:41 compute-0 sudo[173072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jswznthhkfyyppjjplkluxxjxvfrokxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949040.9930978-688-147235021182635/AnsiballZ_file.py'
Nov 24 01:50:41 compute-0 sudo[173072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:41 compute-0 python3.9[173074]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:41 compute-0 sudo[173072]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:41 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 24 01:50:42 compute-0 sudo[173225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imcdszheclrkdwjmzynrsehejptajzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949041.8753493-699-272264328211020/AnsiballZ_systemd_service.py'
Nov 24 01:50:42 compute-0 sudo[173225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:42 compute-0 python3.9[173227]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:50:42 compute-0 systemd[1]: Reloading.
Nov 24 01:50:42 compute-0 systemd-rc-local-generator[173254]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:50:42 compute-0 systemd-sysv-generator[173257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:50:42 compute-0 sudo[173225]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:43 compute-0 python3.9[173411]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:50:43 compute-0 network[173428]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:50:43 compute-0 network[173429]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:50:43 compute-0 network[173430]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:50:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:50:48.408 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:50:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:50:48.409 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:50:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:50:48.410 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:50:49 compute-0 sudo[173702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvgrmamdhdjxagcfdfjmqdyjbxakasww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949049.2331276-718-13736895183857/AnsiballZ_systemd_service.py'
Nov 24 01:50:49 compute-0 sudo[173702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:49 compute-0 python3.9[173704]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:49 compute-0 sudo[173702]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:50 compute-0 sudo[173855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmbtjtogdcvgsplucwkrezzkwfifeghh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949050.0424755-718-146489138283167/AnsiballZ_systemd_service.py'
Nov 24 01:50:50 compute-0 sudo[173855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:50 compute-0 python3.9[173857]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:50 compute-0 sudo[173855]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:51 compute-0 sudo[174008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bekhxjaocfluizkdoutxfnkvsyaxgsps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949050.8710692-718-143796279622922/AnsiballZ_systemd_service.py'
Nov 24 01:50:51 compute-0 sudo[174008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:51 compute-0 python3.9[174010]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:51 compute-0 sudo[174008]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:51 compute-0 sudo[174161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mziqedwcwrfchnltovuzavyjmeqwplld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949051.6506639-718-186732771298602/AnsiballZ_systemd_service.py'
Nov 24 01:50:52 compute-0 sudo[174161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:52 compute-0 python3.9[174163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:52 compute-0 sudo[174161]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:52 compute-0 podman[174264]: 2025-11-24 01:50:52.811063195 +0000 UTC m=+0.060709860 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 24 01:50:52 compute-0 sudo[174333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfntkjghsblivjpfhagtbxelmznxzuxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949052.5130358-718-213043864694295/AnsiballZ_systemd_service.py'
Nov 24 01:50:52 compute-0 sudo[174333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:53 compute-0 python3.9[174335]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:53 compute-0 sudo[174333]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:53 compute-0 sudo[174486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpoxfyebsosrwytkoamolncfemmfzezc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949053.3927972-718-41194770460979/AnsiballZ_systemd_service.py'
Nov 24 01:50:53 compute-0 sudo[174486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:54 compute-0 python3.9[174488]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:54 compute-0 sudo[174486]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:54 compute-0 sudo[174639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxkvulvbnuhdllycxkipnkwwbkdjzafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949054.2623384-718-82605685411326/AnsiballZ_systemd_service.py'
Nov 24 01:50:54 compute-0 sudo[174639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:54 compute-0 python3.9[174641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:54 compute-0 sudo[174639]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:55 compute-0 sudo[174792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyxxqdubvlkzfzowepnlaazjafigzyzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949055.0801787-718-146215335674832/AnsiballZ_systemd_service.py'
Nov 24 01:50:55 compute-0 sudo[174792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:55 compute-0 python3.9[174794]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:50:55 compute-0 sudo[174792]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:56 compute-0 sudo[174945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtrfxxknlklvkvmydgckfewpiwvvaitp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949056.226468-777-175796336652566/AnsiballZ_file.py'
Nov 24 01:50:56 compute-0 sudo[174945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:56 compute-0 python3.9[174947]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:56 compute-0 sudo[174945]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:57 compute-0 sudo[175097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iicieyoosfstvnhihqshqzyacvzgfaxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949056.8959486-777-239016269181759/AnsiballZ_file.py'
Nov 24 01:50:57 compute-0 sudo[175097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:57 compute-0 python3.9[175099]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:57 compute-0 sudo[175097]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:57 compute-0 sudo[175249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gapbjyksqcpefhduyvfuszrxfinpmiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949057.532934-777-63811512376129/AnsiballZ_file.py'
Nov 24 01:50:57 compute-0 sudo[175249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:58 compute-0 podman[175251]: 2025-11-24 01:50:58.026538641 +0000 UTC m=+0.084135624 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 24 01:50:58 compute-0 python3.9[175252]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:58 compute-0 sudo[175249]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:58 compute-0 sudo[175423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwmezrvnodxclaoygazcenqzvhqmhiuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949058.3175817-777-185221116515209/AnsiballZ_file.py'
Nov 24 01:50:58 compute-0 sudo[175423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:58 compute-0 python3.9[175425]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:58 compute-0 sudo[175423]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:59 compute-0 sudo[175591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqdlgfuzvqveynmokhzkfglomobxleb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949059.0177186-777-244178061315213/AnsiballZ_file.py'
Nov 24 01:50:59 compute-0 sudo[175591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:50:59 compute-0 podman[175549]: 2025-11-24 01:50:59.346104257 +0000 UTC m=+0.081771026 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 01:50:59 compute-0 python3.9[175598]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:50:59 compute-0 sudo[175591]: pam_unix(sudo:session): session closed for user root
Nov 24 01:50:59 compute-0 sudo[175753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrxqzwwpbxujmdsmmtvtxamcallhzdfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949059.6886706-777-36038235731183/AnsiballZ_file.py'
Nov 24 01:50:59 compute-0 sudo[175753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:00 compute-0 python3.9[175755]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:00 compute-0 sudo[175753]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:00 compute-0 sudo[175905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxqwetarfokwvomvbyleguoqffwhklwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949060.2960637-777-157575568596190/AnsiballZ_file.py'
Nov 24 01:51:00 compute-0 sudo[175905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:00 compute-0 python3.9[175907]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:00 compute-0 sudo[175905]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:01 compute-0 sudo[176057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjixfdtqfhdlotlrxxwmxylrbwqeivp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949060.9702392-777-18026413104476/AnsiballZ_file.py'
Nov 24 01:51:01 compute-0 sudo[176057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:01 compute-0 python3.9[176059]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:01 compute-0 sudo[176057]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:01 compute-0 sudo[176209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfafqukdjbeveamjgrjdausobvlyvdmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949061.6815817-834-179211908501175/AnsiballZ_file.py'
Nov 24 01:51:01 compute-0 sudo[176209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:02 compute-0 python3.9[176211]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:02 compute-0 sudo[176209]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:02 compute-0 sudo[176361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwamahempmcfszrtlpdmvpssknadifpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949062.3400657-834-126452063050375/AnsiballZ_file.py'
Nov 24 01:51:02 compute-0 sudo[176361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:02 compute-0 python3.9[176363]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:02 compute-0 sudo[176361]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:03 compute-0 sudo[176513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnqrdeaoptnucdggciorhjqculpxshvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949062.9613233-834-76614642947612/AnsiballZ_file.py'
Nov 24 01:51:03 compute-0 sudo[176513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:03 compute-0 python3.9[176515]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:03 compute-0 sudo[176513]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:03 compute-0 sudo[176665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawxyypsobccvdxjlvrwmgarwcdsfzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949063.636988-834-99630530849141/AnsiballZ_file.py'
Nov 24 01:51:03 compute-0 sudo[176665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:04 compute-0 python3.9[176667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:04 compute-0 sudo[176665]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:04 compute-0 sudo[176817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwoxfgsoveylzmpqetynwbnqnykqumeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949064.330498-834-215409309078583/AnsiballZ_file.py'
Nov 24 01:51:04 compute-0 sudo[176817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:04 compute-0 python3.9[176819]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:04 compute-0 sudo[176817]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:05 compute-0 sudo[176969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhjjykrtzvnqmjjrremzpqwmmucrbbfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949065.0400102-834-216863199047157/AnsiballZ_file.py'
Nov 24 01:51:05 compute-0 sudo[176969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:05 compute-0 python3.9[176971]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:05 compute-0 sudo[176969]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:06 compute-0 sudo[177121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhudclfqtrzznjyjhuhflqjgpqzhskpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949065.733596-834-161937289080088/AnsiballZ_file.py'
Nov 24 01:51:06 compute-0 sudo[177121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:06 compute-0 python3.9[177123]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:06 compute-0 sudo[177121]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:06 compute-0 sudo[177273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arwbmysdlglzruzfiggydyqhssyguvbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949066.4047077-834-103093916620111/AnsiballZ_file.py'
Nov 24 01:51:06 compute-0 sudo[177273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:06 compute-0 python3.9[177275]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:06 compute-0 sudo[177273]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:07 compute-0 sudo[177425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpmlchgddrnqnnlyryrffovddiyakeum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949067.2478435-892-243282616270405/AnsiballZ_command.py'
Nov 24 01:51:07 compute-0 sudo[177425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:07 compute-0 python3.9[177427]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:07 compute-0 sudo[177425]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:08 compute-0 python3.9[177579]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:51:09 compute-0 sudo[177729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mteebholqpbtwjplarvqsrdpkxswoyyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949069.087634-910-144231003626218/AnsiballZ_systemd_service.py'
Nov 24 01:51:09 compute-0 sudo[177729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:09 compute-0 python3.9[177731]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:51:09 compute-0 systemd[1]: Reloading.
Nov 24 01:51:09 compute-0 systemd-sysv-generator[177763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:51:09 compute-0 systemd-rc-local-generator[177760]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:51:09 compute-0 sudo[177729]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:10 compute-0 sudo[177917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bthbnkzupkfwampjavkzfkljokxzxzas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949070.1124654-918-11058055914712/AnsiballZ_command.py'
Nov 24 01:51:10 compute-0 sudo[177917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:10 compute-0 python3.9[177919]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:10 compute-0 sudo[177917]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:11 compute-0 sudo[178070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dotdtqpjlmdphczdhglcnybusbxwpkfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949070.782482-918-95073436953740/AnsiballZ_command.py'
Nov 24 01:51:11 compute-0 sudo[178070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:11 compute-0 python3.9[178072]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:11 compute-0 sudo[178070]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:11 compute-0 sudo[178223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbefgsbiamychfiytxubmlicruzvluca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949071.4480212-918-197048683060036/AnsiballZ_command.py'
Nov 24 01:51:11 compute-0 sudo[178223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:11 compute-0 python3.9[178225]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:12 compute-0 sudo[178223]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:12 compute-0 sudo[178376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipazpjgyzsvxvqccbtzrfwkruayhejbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949072.1459603-918-76878472142837/AnsiballZ_command.py'
Nov 24 01:51:12 compute-0 sudo[178376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:12 compute-0 python3.9[178378]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:12 compute-0 sudo[178376]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:13 compute-0 sudo[178529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rufdwhcaehhvrfypudduvaghystdhilk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949072.8485975-918-47998663967790/AnsiballZ_command.py'
Nov 24 01:51:13 compute-0 sudo[178529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:13 compute-0 python3.9[178531]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:13 compute-0 sudo[178529]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:13 compute-0 sudo[178682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ordtobuwqgwxcezkmssonqxohelvdaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949073.5599582-918-49567691500931/AnsiballZ_command.py'
Nov 24 01:51:13 compute-0 sudo[178682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:13 compute-0 python3.9[178684]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:13 compute-0 sudo[178682]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:14 compute-0 sudo[178835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytdrkzaotaixdaqujuybnwjzgwikktr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949074.1434872-918-140419757610310/AnsiballZ_command.py'
Nov 24 01:51:14 compute-0 sudo[178835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:14 compute-0 python3.9[178837]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:14 compute-0 sudo[178835]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:15 compute-0 sudo[178988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewoddbzlfkhrbipyfmmqmdxhfjtflnhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949074.7866225-918-230177559424594/AnsiballZ_command.py'
Nov 24 01:51:15 compute-0 sudo[178988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:15 compute-0 python3.9[178990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:51:15 compute-0 sudo[178988]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:16 compute-0 sudo[179141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrduvbvssdxaauscuqlfcouxpiszidff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949076.400397-997-21178016211300/AnsiballZ_file.py'
Nov 24 01:51:16 compute-0 sudo[179141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:16 compute-0 python3.9[179143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:16 compute-0 sudo[179141]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:17 compute-0 sudo[179293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzqdbsppffuqtemzzojmomtkvdqidxem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949077.1270258-997-59437368721030/AnsiballZ_file.py'
Nov 24 01:51:17 compute-0 sudo[179293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:17 compute-0 python3.9[179295]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:17 compute-0 sudo[179293]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:18 compute-0 sudo[179445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ramvmpfgcqymvyctqwpiztfafatiipvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949077.8169942-997-76417248044031/AnsiballZ_file.py'
Nov 24 01:51:18 compute-0 sudo[179445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:18 compute-0 python3.9[179447]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:18 compute-0 sudo[179445]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:18 compute-0 sudo[179599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cprgdotaugmabfxposxgzvzokmlhcide ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949078.6447945-1019-71733190913383/AnsiballZ_file.py'
Nov 24 01:51:18 compute-0 sudo[179599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:19 compute-0 python3.9[179601]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:19 compute-0 sudo[179599]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:19 compute-0 sudo[179751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfjbebtwzfwcmdoymhvxaupljnvfxlpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949079.3551455-1019-228269565482144/AnsiballZ_file.py'
Nov 24 01:51:19 compute-0 sudo[179751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:19 compute-0 python3.9[179753]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:19 compute-0 sudo[179751]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:20 compute-0 sudo[179903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciijxufugzhvtxltobfffuybxkwqwauj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949080.035455-1019-132475186725778/AnsiballZ_file.py'
Nov 24 01:51:20 compute-0 sudo[179903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:20 compute-0 python3.9[179905]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:20 compute-0 sudo[179903]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:21 compute-0 sudo[180055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqbluxzaohsfqtopbjnjutbvqptmyuts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949080.758336-1019-141810469992438/AnsiballZ_file.py'
Nov 24 01:51:21 compute-0 sudo[180055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:21 compute-0 python3.9[180057]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:21 compute-0 sudo[180055]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:21 compute-0 sudo[180207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdqizzforzsdumcagjysmwhjfzugtxrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949081.4270203-1019-246431766753563/AnsiballZ_file.py'
Nov 24 01:51:21 compute-0 sudo[180207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:21 compute-0 python3.9[180209]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:21 compute-0 sudo[180207]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:22 compute-0 sudo[180359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkapwglylqcqhvewtaqldkbqakefihrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949082.0857298-1019-120017185665795/AnsiballZ_file.py'
Nov 24 01:51:22 compute-0 sudo[180359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:22 compute-0 python3.9[180361]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:22 compute-0 sudo[180359]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:23 compute-0 sudo[180521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uobtsiyyhbohlrixvdpndywcaipqpvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949082.7554455-1019-124384943092590/AnsiballZ_file.py'
Nov 24 01:51:23 compute-0 sudo[180521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:23 compute-0 podman[180485]: 2025-11-24 01:51:23.113118141 +0000 UTC m=+0.082989727 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:51:23 compute-0 python3.9[180526]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:23 compute-0 sudo[180521]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:26 compute-0 sshd-session[179548]: Connection closed by 180.76.115.202 port 37940 [preauth]
Nov 24 01:51:27 compute-0 sudo[180682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojawtmqbajjtpydpvdvlrecdpmxfeobj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949087.3004615-1188-264569642673959/AnsiballZ_getent.py'
Nov 24 01:51:27 compute-0 sudo[180682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:27 compute-0 python3.9[180684]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 24 01:51:28 compute-0 sudo[180682]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:28 compute-0 sudo[180849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgvsnuokuhugdkwwqtaagujcscahhrro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949088.1929905-1196-87851379387226/AnsiballZ_group.py'
Nov 24 01:51:28 compute-0 sudo[180849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:28 compute-0 podman[180809]: 2025-11-24 01:51:28.675916847 +0000 UTC m=+0.068415214 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 01:51:28 compute-0 python3.9[180857]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:51:28 compute-0 groupadd[180858]: group added to /etc/group: name=nova, GID=42436
Nov 24 01:51:28 compute-0 groupadd[180858]: group added to /etc/gshadow: name=nova
Nov 24 01:51:28 compute-0 groupadd[180858]: new group: name=nova, GID=42436
Nov 24 01:51:28 compute-0 sudo[180849]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:29 compute-0 sudo[181028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsvizsctwvrrlevwnuiqixkeaaqjmomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949089.1299973-1204-25952268948860/AnsiballZ_user.py'
Nov 24 01:51:29 compute-0 sudo[181028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:29 compute-0 podman[180987]: 2025-11-24 01:51:29.740715518 +0000 UTC m=+0.136651933 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 01:51:29 compute-0 python3.9[181035]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 01:51:30 compute-0 useradd[181043]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 24 01:51:30 compute-0 useradd[181043]: add 'nova' to group 'libvirt'
Nov 24 01:51:30 compute-0 useradd[181043]: add 'nova' to shadow group 'libvirt'
Nov 24 01:51:30 compute-0 sudo[181028]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:31 compute-0 sshd-session[181074]: Accepted publickey for zuul from 192.168.122.30 port 58150 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:51:31 compute-0 systemd-logind[791]: New session 24 of user zuul.
Nov 24 01:51:31 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 24 01:51:31 compute-0 sshd-session[181074]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:51:31 compute-0 sshd-session[181077]: Received disconnect from 192.168.122.30 port 58150:11: disconnected by user
Nov 24 01:51:31 compute-0 sshd-session[181077]: Disconnected from user zuul 192.168.122.30 port 58150
Nov 24 01:51:31 compute-0 sshd-session[181074]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:51:31 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 24 01:51:31 compute-0 systemd-logind[791]: Session 24 logged out. Waiting for processes to exit.
Nov 24 01:51:31 compute-0 systemd-logind[791]: Removed session 24.
Nov 24 01:51:31 compute-0 python3.9[181227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:32 compute-0 python3.9[181348]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949091.4814162-1229-274053424751387/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:33 compute-0 python3.9[181498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:33 compute-0 python3.9[181574]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:34 compute-0 python3.9[181724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:34 compute-0 python3.9[181845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949093.7758296-1229-236690783147934/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:35 compute-0 python3.9[181995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:36 compute-0 python3.9[182116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949094.9593763-1229-197206359702741/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:36 compute-0 python3.9[182266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:37 compute-0 python3.9[182387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949096.2988057-1229-42879908101310/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:38 compute-0 python3.9[182537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:38 compute-0 python3.9[182658]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949097.4945214-1229-152763435290701/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:39 compute-0 sudo[182808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxbwzahegjugphxdacobmwqmqvhilfyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949098.8476744-1312-128151199821397/AnsiballZ_file.py'
Nov 24 01:51:39 compute-0 sudo[182808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:39 compute-0 python3.9[182810]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:39 compute-0 sudo[182808]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:39 compute-0 sudo[182960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttzropsetkpeeexrguhnxizrorljipzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949099.5412965-1320-199474584913956/AnsiballZ_copy.py'
Nov 24 01:51:39 compute-0 sudo[182960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:39 compute-0 python3.9[182962]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:40 compute-0 sudo[182960]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:40 compute-0 sudo[183112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwjkhnjwddqwogpiltfkzcdddwjvvug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949100.1606681-1328-207911964601889/AnsiballZ_stat.py'
Nov 24 01:51:40 compute-0 sudo[183112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:40 compute-0 python3.9[183114]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:40 compute-0 sudo[183112]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:41 compute-0 sudo[183264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxiotjassaabhwqgsalhheoeodelbmzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949100.8188686-1336-60315212651786/AnsiballZ_stat.py'
Nov 24 01:51:41 compute-0 sudo[183264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:41 compute-0 python3.9[183266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:41 compute-0 sudo[183264]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:41 compute-0 sudo[183387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqeokfahdsjmauebfktbygsaddroopy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949100.8188686-1336-60315212651786/AnsiballZ_copy.py'
Nov 24 01:51:41 compute-0 sudo[183387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:41 compute-0 python3.9[183389]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763949100.8188686-1336-60315212651786/.source _original_basename=.c634qaqc follow=False checksum=f734460c470161066681a4b81837d89354e6288d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 24 01:51:41 compute-0 sudo[183387]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:42 compute-0 python3.9[183541]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:43 compute-0 python3.9[183695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:43 compute-0 sshd-session[183591]: Invalid user ubuntu from 46.188.119.26 port 34468
Nov 24 01:51:43 compute-0 sshd-session[183591]: Received disconnect from 46.188.119.26 port 34468:11: Bye Bye [preauth]
Nov 24 01:51:43 compute-0 sshd-session[183591]: Disconnected from invalid user ubuntu 46.188.119.26 port 34468 [preauth]
Nov 24 01:51:43 compute-0 python3.9[183816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949102.9270353-1362-226316585730351/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:44 compute-0 python3.9[183966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:51:45 compute-0 python3.9[184087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949104.138535-1377-6472514140694/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:51:45 compute-0 sudo[184237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmqqyxlsniqivmpocozfzckoufqbrmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949105.3342018-1394-147479084303960/AnsiballZ_container_config_data.py'
Nov 24 01:51:45 compute-0 sudo[184237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:45 compute-0 python3.9[184239]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 24 01:51:45 compute-0 sudo[184237]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:46 compute-0 sudo[184389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhqskklcoboolbnebcbjijktajkpsknx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949106.1459305-1403-41321622502523/AnsiballZ_container_config_hash.py'
Nov 24 01:51:46 compute-0 sudo[184389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:46 compute-0 python3.9[184391]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:51:46 compute-0 sudo[184389]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:47 compute-0 sudo[184541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyhumfwneyuguxtczbubydhloixhrns ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949106.9678833-1413-81290562651134/AnsiballZ_edpm_container_manage.py'
Nov 24 01:51:47 compute-0 sudo[184541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:47 compute-0 python3[184543]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:51:47 compute-0 podman[184576]: 2025-11-24 01:51:47.898272877 +0000 UTC m=+0.078454968 container create e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 01:51:47 compute-0 podman[184576]: 2025-11-24 01:51:47.859239384 +0000 UTC m=+0.039421565 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 01:51:47 compute-0 python3[184543]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 24 01:51:48 compute-0 sudo[184541]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:51:48.409 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:51:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:51:48.410 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:51:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:51:48.410 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:51:48 compute-0 sudo[184764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlpfxfkemilpbiicbfepizvoyvzzwicj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949108.2005968-1421-99095800193707/AnsiballZ_stat.py'
Nov 24 01:51:48 compute-0 sudo[184764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:48 compute-0 python3.9[184766]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:48 compute-0 sudo[184764]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:49 compute-0 sudo[184918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tghcwjlsgdrgejcsakvejnsvflypfrdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949109.1020913-1433-280259579129760/AnsiballZ_container_config_data.py'
Nov 24 01:51:49 compute-0 sudo[184918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:49 compute-0 python3.9[184920]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 24 01:51:49 compute-0 sudo[184918]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:50 compute-0 sudo[185070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqckajbsffjvwxdexafvwumcmpapzwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949109.8756542-1442-211629335228621/AnsiballZ_container_config_hash.py'
Nov 24 01:51:50 compute-0 sudo[185070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:50 compute-0 python3.9[185072]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:51:50 compute-0 sudo[185070]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:50 compute-0 sudo[185222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvgwxpayfkmenmwezfftyfucvoowtpb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949110.6351593-1452-9852415331982/AnsiballZ_edpm_container_manage.py'
Nov 24 01:51:50 compute-0 sudo[185222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:51 compute-0 python3[185224]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:51:51 compute-0 podman[185262]: 2025-11-24 01:51:51.417114465 +0000 UTC m=+0.050871159 container create 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.build-date=20251118)
Nov 24 01:51:51 compute-0 podman[185262]: 2025-11-24 01:51:51.390251566 +0000 UTC m=+0.024008280 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 01:51:51 compute-0 python3[185224]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 24 01:51:51 compute-0 sudo[185222]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:52 compute-0 sudo[185451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-polvrqxzzvwudszfwopzwmdvbvssxmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949111.7481287-1460-196528456974140/AnsiballZ_stat.py'
Nov 24 01:51:52 compute-0 sudo[185451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:52 compute-0 python3.9[185453]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:52 compute-0 sudo[185451]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:52 compute-0 sudo[185605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymefatqfqdkdebzjaimgpnwmteijawrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949112.58269-1469-272304831884792/AnsiballZ_file.py'
Nov 24 01:51:52 compute-0 sudo[185605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:53 compute-0 python3.9[185607]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:53 compute-0 sudo[185605]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:53 compute-0 sudo[185766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrdlpaxiwjrqzfrhwkgpayuyvnasgubc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949113.1804862-1469-23557464795072/AnsiballZ_copy.py'
Nov 24 01:51:53 compute-0 sudo[185766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:53 compute-0 podman[185730]: 2025-11-24 01:51:53.63821034 +0000 UTC m=+0.076893084 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:51:53 compute-0 python3.9[185776]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949113.1804862-1469-23557464795072/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:51:53 compute-0 sudo[185766]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:54 compute-0 sudo[185851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwlnnxfwhxjmgqxigiezxyurncuwjzce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949113.1804862-1469-23557464795072/AnsiballZ_systemd.py'
Nov 24 01:51:54 compute-0 sudo[185851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:54 compute-0 python3.9[185853]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:51:54 compute-0 systemd[1]: Reloading.
Nov 24 01:51:54 compute-0 systemd-rc-local-generator[185871]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:51:54 compute-0 systemd-sysv-generator[185878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:51:54 compute-0 sudo[185851]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:55 compute-0 sudo[185961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atgtmsuiqajaqhzpwrdnfmjocigyzcny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949113.1804862-1469-23557464795072/AnsiballZ_systemd.py'
Nov 24 01:51:55 compute-0 sudo[185961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:55 compute-0 python3.9[185963]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:51:55 compute-0 systemd[1]: Reloading.
Nov 24 01:51:55 compute-0 systemd-sysv-generator[185998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:51:55 compute-0 systemd-rc-local-generator[185994]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:51:55 compute-0 systemd[1]: Starting nova_compute container...
Nov 24 01:51:55 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:51:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 01:51:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 01:51:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 01:51:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 01:51:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 01:51:55 compute-0 podman[186003]: 2025-11-24 01:51:55.908455424 +0000 UTC m=+0.119796206 container init 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm)
Nov 24 01:51:55 compute-0 podman[186003]: 2025-11-24 01:51:55.920841054 +0000 UTC m=+0.132181806 container start 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:51:55 compute-0 podman[186003]: nova_compute
Nov 24 01:51:55 compute-0 nova_compute[186018]: + sudo -E kolla_set_configs
Nov 24 01:51:55 compute-0 systemd[1]: Started nova_compute container.
Nov 24 01:51:55 compute-0 sudo[185961]: pam_unix(sudo:session): session closed for user root
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Validating config file
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying service configuration files
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Deleting /etc/ceph
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Creating directory /etc/ceph
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Writing out command to execute
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:51:56 compute-0 nova_compute[186018]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 01:51:56 compute-0 nova_compute[186018]: ++ cat /run_command
Nov 24 01:51:56 compute-0 nova_compute[186018]: + CMD=nova-compute
Nov 24 01:51:56 compute-0 nova_compute[186018]: + ARGS=
Nov 24 01:51:56 compute-0 nova_compute[186018]: + sudo kolla_copy_cacerts
Nov 24 01:51:56 compute-0 nova_compute[186018]: + [[ ! -n '' ]]
Nov 24 01:51:56 compute-0 nova_compute[186018]: + . kolla_extend_start
Nov 24 01:51:56 compute-0 nova_compute[186018]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 01:51:56 compute-0 nova_compute[186018]: Running command: 'nova-compute'
Nov 24 01:51:56 compute-0 nova_compute[186018]: + umask 0022
Nov 24 01:51:56 compute-0 nova_compute[186018]: + exec nova-compute
Nov 24 01:51:57 compute-0 python3.9[186180]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:57 compute-0 python3.9[186330]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.035 186022 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.035 186022 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.035 186022 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.036 186022 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.182 186022 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.204 186022 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.204 186022 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 01:51:58 compute-0 python3.9[186486]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.743 186022 INFO nova.virt.driver [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 01:51:58 compute-0 podman[186519]: 2025-11-24 01:51:58.805136981 +0000 UTC m=+0.061575981 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible)
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.843 186022 INFO nova.compute.provider_config [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.857 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.858 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.858 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.859 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.859 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.859 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.859 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.859 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.860 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.861 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.861 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.861 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.861 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.861 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.862 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.863 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.864 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.865 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.866 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.866 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.866 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.866 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.866 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.867 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.868 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.869 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.870 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.871 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.872 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.873 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.874 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.875 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.876 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.877 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.878 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.879 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.880 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.881 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.882 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.882 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.882 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.882 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.882 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.883 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.884 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.885 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.886 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.887 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.888 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.889 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.890 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.891 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.892 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.893 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.894 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.895 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.896 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.897 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.898 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.898 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.898 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.898 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.898 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.899 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.900 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.901 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.902 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.903 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.904 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.905 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.906 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.907 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.908 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.909 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.910 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.911 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.912 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.912 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.912 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.912 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.913 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.914 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.915 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.916 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.917 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.918 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.919 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.920 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.921 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.922 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.923 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.924 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.925 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.926 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.927 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.928 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.929 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.930 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.931 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.932 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.932 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.932 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.932 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.932 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.933 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.933 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.933 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.933 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.933 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.934 186022 WARNING oslo_config.cfg [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 01:51:58 compute-0 nova_compute[186018]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 01:51:58 compute-0 nova_compute[186018]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 01:51:58 compute-0 nova_compute[186018]: and ``live_migration_inbound_addr`` respectively.
Nov 24 01:51:58 compute-0 nova_compute[186018]: ).  Its value may be silently ignored in the future.
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.934 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.934 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.934 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.934 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.935 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.936 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.937 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.938 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.938 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.938 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.938 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.938 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.939 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.940 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.941 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.942 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.943 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.944 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.945 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.946 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.947 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.948 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.948 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.948 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.948 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.948 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.949 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.950 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.951 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.952 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.953 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.954 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.955 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.955 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.955 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.955 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.955 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.956 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.956 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.956 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.956 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.957 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.957 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.957 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.957 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.957 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.958 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.959 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.960 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.961 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.962 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.963 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.964 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.965 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.966 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.967 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.968 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.969 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.970 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.971 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.972 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.973 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.973 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.973 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.973 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.973 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.974 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.975 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.976 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.977 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.978 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.979 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.980 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.980 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.980 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.980 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.980 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.981 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.982 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.982 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.982 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.982 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.982 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.983 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.984 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.985 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.986 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.987 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.988 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.989 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.990 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.991 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.992 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:58 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.993 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.994 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.995 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.996 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.997 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.998 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:58.999 186022 DEBUG oslo_service.service [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.001 186022 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.021 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.023 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.023 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.023 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 01:51:59 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 01:51:59 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.100 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f580baf09d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.103 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f580baf09d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.104 186022 INFO nova.virt.libvirt.driver [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Connection event '1' reason 'None'
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.125 186022 WARNING nova.virt.libvirt.driver [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 01:51:59 compute-0 nova_compute[186018]: 2025-11-24 01:51:59.125 186022 DEBUG nova.virt.libvirt.volume.mount [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 01:51:59 compute-0 sudo[186706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-comjeifahwonxodljkhyhohgxtpekico ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949118.7510247-1529-257382735732696/AnsiballZ_podman_container.py'
Nov 24 01:51:59 compute-0 sudo[186706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:51:59 compute-0 sshd-session[186371]: Invalid user sftp from 154.90.59.75 port 39746
Nov 24 01:51:59 compute-0 python3.9[186708]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 01:51:59 compute-0 sshd-session[186371]: Received disconnect from 154.90.59.75 port 39746:11: Bye Bye [preauth]
Nov 24 01:51:59 compute-0 sshd-session[186371]: Disconnected from invalid user sftp 154.90.59.75 port 39746 [preauth]
Nov 24 01:51:59 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:51:59 compute-0 sudo[186706]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.000 186022 INFO nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]: 
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <host>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <uuid>a7051ccc-fa00-488d-9432-c0e2d4ac9648</uuid>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <arch>x86_64</arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model>EPYC-Rome-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <vendor>AMD</vendor>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <microcode version='16777317'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <signature family='23' model='49' stepping='0'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='x2apic'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='tsc-deadline'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='osxsave'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='hypervisor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='tsc_adjust'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='spec-ctrl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='stibp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='arch-capabilities'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='cmp_legacy'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='topoext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='virt-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='lbrv'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='tsc-scale'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='vmcb-clean'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='pause-filter'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='pfthreshold'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='svme-addr-chk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='rdctl-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='mds-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature name='pschange-mc-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <pages unit='KiB' size='4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <pages unit='KiB' size='2048'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <pages unit='KiB' size='1048576'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <power_management>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <suspend_mem/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <suspend_disk/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <suspend_hybrid/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </power_management>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <iommu support='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <migration_features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <live/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <uri_transports>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <uri_transport>tcp</uri_transport>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <uri_transport>rdma</uri_transport>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </uri_transports>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </migration_features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <topology>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <cells num='1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <cell id='0'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <memory unit='KiB'>7864320</memory>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <distances>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <sibling id='0' value='10'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           </distances>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           <cpus num='8'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:           </cpus>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         </cell>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </cells>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </topology>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <cache>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </cache>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <secmodel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model>selinux</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <doi>0</doi>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </secmodel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <secmodel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model>dac</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <doi>0</doi>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </secmodel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </host>
Nov 24 01:52:00 compute-0 nova_compute[186018]: 
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <guest>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <os_type>hvm</os_type>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <arch name='i686'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <wordsize>32</wordsize>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <domain type='qemu'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <domain type='kvm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <pae/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <nonpae/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <acpi default='on' toggle='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <apic default='on' toggle='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <cpuselection/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <deviceboot/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <disksnapshot default='on' toggle='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <externalSnapshot/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </guest>
Nov 24 01:52:00 compute-0 nova_compute[186018]: 
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <guest>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <os_type>hvm</os_type>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <arch name='x86_64'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <wordsize>64</wordsize>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <domain type='qemu'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <domain type='kvm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <acpi default='on' toggle='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <apic default='on' toggle='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <cpuselection/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <deviceboot/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <disksnapshot default='on' toggle='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <externalSnapshot/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </guest>
Nov 24 01:52:00 compute-0 nova_compute[186018]: 
Nov 24 01:52:00 compute-0 nova_compute[186018]: </capabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]: 
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.010 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.039 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 01:52:00 compute-0 nova_compute[186018]: <domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <domain>kvm</domain>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <arch>i686</arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <vcpu max='240'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <iothreads supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <os supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='firmware'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <loader supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>rom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pflash</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='readonly'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>yes</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='secure'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </loader>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </os>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='maximumMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <vendor>AMD</vendor>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='succor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='custom' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-128'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-256'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-512'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <memoryBacking supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='sourceType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>anonymous</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>memfd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </memoryBacking>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <disk supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='diskDevice'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>disk</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cdrom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>floppy</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>lun</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ide</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>fdc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>sata</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </disk>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <graphics supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vnc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egl-headless</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </graphics>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <video supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='modelType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vga</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cirrus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>none</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>bochs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ramfb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </video>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hostdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='mode'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>subsystem</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='startupPolicy'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>mandatory</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>requisite</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>optional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='subsysType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pci</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='capsType'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='pciBackend'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hostdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <rng supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>random</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </rng>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <filesystem supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='driverType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>path</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>handle</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtiofs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </filesystem>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <tpm supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-tis</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-crb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emulator</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>external</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendVersion'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>2.0</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </tpm>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <redirdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </redirdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <channel supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </channel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <crypto supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </crypto>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <interface supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>passt</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </interface>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <panic supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>isa</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>hyperv</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </panic>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <console supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>null</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dev</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pipe</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stdio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>udp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tcp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu-vdagent</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </console>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <gic supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <genid supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backup supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <async-teardown supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <ps2 supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sev supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sgx supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hyperv supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='features'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>relaxed</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vapic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>spinlocks</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vpindex</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>runtime</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>synic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stimer</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reset</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vendor_id</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>frequencies</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reenlightenment</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tlbflush</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ipi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>avic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emsr_bitmap</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>xmm_input</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hyperv>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <launchSecurity supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='sectype'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tdx</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </launchSecurity>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]: </domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.045 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 01:52:00 compute-0 nova_compute[186018]: <domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <domain>kvm</domain>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <arch>i686</arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <vcpu max='4096'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <iothreads supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <os supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='firmware'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <loader supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>rom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pflash</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='readonly'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>yes</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='secure'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </loader>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </os>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='maximumMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <vendor>AMD</vendor>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='succor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='custom' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-128'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-256'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-512'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 podman[186838]: 2025-11-24 01:52:00.122089936 +0000 UTC m=+0.093796852 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 sudo[186916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bufhzgvudlianrkidzynavqcnvwlyryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949119.8378685-1537-263469965409228/AnsiballZ_systemd.py'
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 sudo[186916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <memoryBacking supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='sourceType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>anonymous</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>memfd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </memoryBacking>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <disk supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='diskDevice'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>disk</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cdrom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>floppy</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>lun</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>fdc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>sata</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </disk>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <graphics supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vnc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egl-headless</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </graphics>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <video supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='modelType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vga</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cirrus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>none</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>bochs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ramfb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </video>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hostdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='mode'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>subsystem</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='startupPolicy'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>mandatory</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>requisite</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>optional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='subsysType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pci</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='capsType'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='pciBackend'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hostdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <rng supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>random</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </rng>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <filesystem supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='driverType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>path</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>handle</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtiofs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </filesystem>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <tpm supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-tis</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-crb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emulator</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>external</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendVersion'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>2.0</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </tpm>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <redirdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </redirdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <channel supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </channel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <crypto supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </crypto>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <interface supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>passt</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </interface>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <panic supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>isa</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>hyperv</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </panic>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <console supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>null</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dev</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pipe</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stdio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>udp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tcp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu-vdagent</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </console>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <gic supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <genid supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backup supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <async-teardown supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <ps2 supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sev supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sgx supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hyperv supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='features'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>relaxed</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vapic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>spinlocks</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vpindex</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>runtime</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>synic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stimer</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reset</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vendor_id</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>frequencies</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reenlightenment</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tlbflush</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ipi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>avic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emsr_bitmap</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>xmm_input</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hyperv>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <launchSecurity supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='sectype'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tdx</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </launchSecurity>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]: </domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.088 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.094 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 01:52:00 compute-0 nova_compute[186018]: <domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <domain>kvm</domain>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <arch>x86_64</arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <vcpu max='240'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <iothreads supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <os supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='firmware'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <loader supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>rom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pflash</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='readonly'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>yes</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='secure'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </loader>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </os>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='maximumMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <vendor>AMD</vendor>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='succor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='custom' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-128'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-256'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-512'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <memoryBacking supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='sourceType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>anonymous</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>memfd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </memoryBacking>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <disk supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='diskDevice'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>disk</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cdrom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>floppy</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>lun</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ide</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>fdc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>sata</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </disk>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <graphics supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vnc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egl-headless</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </graphics>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <video supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='modelType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vga</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cirrus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>none</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>bochs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ramfb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </video>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hostdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='mode'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>subsystem</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='startupPolicy'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>mandatory</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>requisite</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>optional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='subsysType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pci</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='capsType'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='pciBackend'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hostdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <rng supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>random</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </rng>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <filesystem supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='driverType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>path</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>handle</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtiofs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </filesystem>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <tpm supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-tis</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-crb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emulator</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>external</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendVersion'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>2.0</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </tpm>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <redirdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </redirdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <channel supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </channel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <crypto supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </crypto>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <interface supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>passt</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </interface>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <panic supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>isa</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>hyperv</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </panic>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <console supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>null</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dev</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pipe</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stdio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>udp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tcp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu-vdagent</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </console>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <gic supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <genid supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backup supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <async-teardown supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <ps2 supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sev supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sgx supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hyperv supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='features'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>relaxed</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vapic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>spinlocks</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vpindex</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>runtime</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>synic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stimer</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reset</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vendor_id</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>frequencies</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reenlightenment</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tlbflush</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ipi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>avic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emsr_bitmap</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>xmm_input</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hyperv>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <launchSecurity supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='sectype'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tdx</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </launchSecurity>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]: </domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.150 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 01:52:00 compute-0 nova_compute[186018]: <domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <domain>kvm</domain>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <arch>x86_64</arch>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <vcpu max='4096'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <iothreads supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <os supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='firmware'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>efi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <loader supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>rom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pflash</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='readonly'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>yes</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='secure'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>yes</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>no</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </loader>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </os>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='maximumMigratable'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>on</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>off</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <vendor>AMD</vendor>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='succor'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <mode name='custom' supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Denverton-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='auto-ibrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amd-psfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='stibp-always-on'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='EPYC-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-128'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-256'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx10-512'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='prefetchiti'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Haswell-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512er'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512pf'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fma4'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tbm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xop'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='amx-tile'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-bf16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-fp16'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bitalg'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrc'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fzrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='la57'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='taa-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xfd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ifma'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cmpccxadd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fbsdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='fsrs'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ibrs-all'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mcdt-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pbrsb-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='psdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='serialize'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vaes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='hle'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='rtm'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512bw'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512cd'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512dq'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512f'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='avx512vl'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='invpcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pcid'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='pku'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='mpx'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='core-capability'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='split-lock-detect'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='cldemote'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='erms'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='gfni'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdir64b'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='movdiri'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='xsaves'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='athlon-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='core2duo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='coreduo-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='n270-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='ss'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <blockers model='phenom-v1'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnow'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <feature name='3dnowext'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </blockers>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </mode>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </cpu>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <memoryBacking supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <enum name='sourceType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>anonymous</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <value>memfd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </memoryBacking>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <disk supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='diskDevice'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>disk</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cdrom</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>floppy</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>lun</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>fdc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>sata</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </disk>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <graphics supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vnc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egl-headless</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </graphics>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <video supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='modelType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vga</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>cirrus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>none</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>bochs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ramfb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </video>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hostdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='mode'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>subsystem</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='startupPolicy'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>mandatory</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>requisite</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>optional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='subsysType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pci</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>scsi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='capsType'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='pciBackend'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hostdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <rng supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtio-non-transitional</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>random</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>egd</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </rng>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <filesystem supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='driverType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>path</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>handle</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>virtiofs</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </filesystem>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <tpm supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-tis</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tpm-crb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emulator</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>external</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendVersion'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>2.0</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </tpm>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <redirdev supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='bus'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>usb</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </redirdev>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <channel supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </channel>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <crypto supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendModel'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>builtin</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </crypto>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <interface supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='backendType'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>default</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>passt</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </interface>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <panic supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='model'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>isa</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>hyperv</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </panic>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <console supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='type'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>null</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vc</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pty</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dev</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>file</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>pipe</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stdio</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>udp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tcp</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>unix</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>qemu-vdagent</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>dbus</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </console>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </devices>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   <features>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <gic supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <genid supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <backup supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <async-teardown supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <ps2 supported='yes'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sev supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <sgx supported='no'/>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <hyperv supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='features'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>relaxed</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vapic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>spinlocks</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vpindex</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>runtime</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>synic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>stimer</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reset</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>vendor_id</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>frequencies</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>reenlightenment</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tlbflush</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>ipi</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>avic</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>emsr_bitmap</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>xmm_input</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </defaults>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </hyperv>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     <launchSecurity supported='yes'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       <enum name='sectype'>
Nov 24 01:52:00 compute-0 nova_compute[186018]:         <value>tdx</value>
Nov 24 01:52:00 compute-0 nova_compute[186018]:       </enum>
Nov 24 01:52:00 compute-0 nova_compute[186018]:     </launchSecurity>
Nov 24 01:52:00 compute-0 nova_compute[186018]:   </features>
Nov 24 01:52:00 compute-0 nova_compute[186018]: </domainCapabilities>
Nov 24 01:52:00 compute-0 nova_compute[186018]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.209 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.209 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.210 186022 DEBUG nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.210 186022 INFO nova.virt.libvirt.host [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Secure Boot support detected
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.212 186022 INFO nova.virt.libvirt.driver [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.212 186022 INFO nova.virt.libvirt.driver [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.220 186022 DEBUG nova.virt.libvirt.driver [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.269 186022 INFO nova.virt.node [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Determined node identity f28f14d1-2972-450a-b67e-0899e7918234 from /var/lib/nova/compute_id
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.286 186022 WARNING nova.compute.manager [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Compute nodes ['f28f14d1-2972-450a-b67e-0899e7918234'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.317 186022 INFO nova.compute.manager [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.360 186022 WARNING nova.compute.manager [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.360 186022 DEBUG oslo_concurrency.lockutils [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.361 186022 DEBUG oslo_concurrency.lockutils [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.361 186022 DEBUG oslo_concurrency.lockutils [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.361 186022 DEBUG nova.compute.resource_tracker [None req-5fe078ba-c757-4405-b167-0b0257dc3e07 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:52:00 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 01:52:00 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 24 01:52:00 compute-0 python3.9[186918]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:52:00 compute-0 systemd[1]: Stopping nova_compute container...
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.560 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.561 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:52:00 compute-0 nova_compute[186018]: 2025-11-24 01:52:00.561 186022 DEBUG oslo_concurrency.lockutils [None req-87fe19df-cee5-46fc-a80d-fc13b23978ee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:52:01 compute-0 virtqemud[186602]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 24 01:52:01 compute-0 virtqemud[186602]: hostname: compute-0
Nov 24 01:52:01 compute-0 virtqemud[186602]: End of file while reading data: Input/output error
Nov 24 01:52:01 compute-0 systemd[1]: libpod-7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67.scope: Deactivated successfully.
Nov 24 01:52:01 compute-0 systemd[1]: libpod-7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67.scope: Consumed 3.382s CPU time.
Nov 24 01:52:01 compute-0 podman[186943]: 2025-11-24 01:52:01.246091199 +0000 UTC m=+0.739287293 container died 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute)
Nov 24 01:52:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67-userdata-shm.mount: Deactivated successfully.
Nov 24 01:52:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d-merged.mount: Deactivated successfully.
Nov 24 01:52:01 compute-0 podman[186943]: 2025-11-24 01:52:01.317294961 +0000 UTC m=+0.810491055 container cleanup 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:52:01 compute-0 podman[186943]: nova_compute
Nov 24 01:52:01 compute-0 podman[186971]: nova_compute
Nov 24 01:52:01 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 24 01:52:01 compute-0 systemd[1]: Stopped nova_compute container.
Nov 24 01:52:01 compute-0 systemd[1]: Starting nova_compute container...
Nov 24 01:52:01 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:52:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/529d39bef52bd6da4086d5ae19d45295dae2451096d7203f515ac305ab5fa77d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:01 compute-0 podman[186984]: 2025-11-24 01:52:01.533175392 +0000 UTC m=+0.092082814 container init 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 01:52:01 compute-0 podman[186984]: 2025-11-24 01:52:01.546803287 +0000 UTC m=+0.105710719 container start 7e29a31a6dba12703f194c0ef769135167fe0e2bf8283ffb95dad74976665f67 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 01:52:01 compute-0 podman[186984]: nova_compute
Nov 24 01:52:01 compute-0 nova_compute[186999]: + sudo -E kolla_set_configs
Nov 24 01:52:01 compute-0 systemd[1]: Started nova_compute container.
Nov 24 01:52:01 compute-0 sudo[186916]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Validating config file
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying service configuration files
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /etc/ceph
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Creating directory /etc/ceph
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Writing out command to execute
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:52:01 compute-0 nova_compute[186999]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 01:52:01 compute-0 nova_compute[186999]: ++ cat /run_command
Nov 24 01:52:01 compute-0 nova_compute[186999]: + CMD=nova-compute
Nov 24 01:52:01 compute-0 nova_compute[186999]: + ARGS=
Nov 24 01:52:01 compute-0 nova_compute[186999]: + sudo kolla_copy_cacerts
Nov 24 01:52:01 compute-0 nova_compute[186999]: + [[ ! -n '' ]]
Nov 24 01:52:01 compute-0 nova_compute[186999]: + . kolla_extend_start
Nov 24 01:52:01 compute-0 nova_compute[186999]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 01:52:01 compute-0 nova_compute[186999]: Running command: 'nova-compute'
Nov 24 01:52:01 compute-0 nova_compute[186999]: + umask 0022
Nov 24 01:52:01 compute-0 nova_compute[186999]: + exec nova-compute
Nov 24 01:52:02 compute-0 sudo[187160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trarndgbwgyilzivrtjfpajbvynefqzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949121.8253798-1546-228778481231732/AnsiballZ_podman_container.py'
Nov 24 01:52:02 compute-0 sudo[187160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:02 compute-0 python3.9[187162]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 01:52:02 compute-0 systemd[1]: Started libpod-conmon-e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870.scope.
Nov 24 01:52:02 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:52:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefce1e063ba4ee6b3082baeae0d8cd650f75a3ec56be92fdc8214d39c81cddf/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefce1e063ba4ee6b3082baeae0d8cd650f75a3ec56be92fdc8214d39c81cddf/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefce1e063ba4ee6b3082baeae0d8cd650f75a3ec56be92fdc8214d39c81cddf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 01:52:02 compute-0 podman[187187]: 2025-11-24 01:52:02.66160936 +0000 UTC m=+0.175013477 container init e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 24 01:52:02 compute-0 podman[187187]: 2025-11-24 01:52:02.669963416 +0000 UTC m=+0.183367483 container start e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:52:02 compute-0 python3.9[187162]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Applying nova statedir ownership
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 24 01:52:02 compute-0 nova_compute_init[187208]: INFO:nova_statedir:Nova statedir ownership complete
Nov 24 01:52:02 compute-0 systemd[1]: libpod-e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870.scope: Deactivated successfully.
Nov 24 01:52:02 compute-0 podman[187209]: 2025-11-24 01:52:02.750061189 +0000 UTC m=+0.042471161 container died e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 24 01:52:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870-userdata-shm.mount: Deactivated successfully.
Nov 24 01:52:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-eefce1e063ba4ee6b3082baeae0d8cd650f75a3ec56be92fdc8214d39c81cddf-merged.mount: Deactivated successfully.
Nov 24 01:52:02 compute-0 podman[187219]: 2025-11-24 01:52:02.826493629 +0000 UTC m=+0.081700619 container cleanup e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 01:52:02 compute-0 systemd[1]: libpod-conmon-e405b95f803a1b4431782d29ccc26d469306406bb3c564cc043c5bd4e3c10870.scope: Deactivated successfully.
Nov 24 01:52:02 compute-0 sudo[187160]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:03 compute-0 sshd-session[158864]: Connection closed by 192.168.122.30 port 37096
Nov 24 01:52:03 compute-0 sshd-session[158861]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:52:03 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 24 01:52:03 compute-0 systemd[1]: session-23.scope: Consumed 2min 1.680s CPU time.
Nov 24 01:52:03 compute-0 systemd-logind[791]: Session 23 logged out. Waiting for processes to exit.
Nov 24 01:52:03 compute-0 systemd-logind[791]: Removed session 23.
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.591 187003 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.592 187003 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.592 187003 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.592 187003 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.735 187003 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.761 187003 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:52:03 compute-0 nova_compute[186999]: 2025-11-24 01:52:03.761 187003 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.161 187003 INFO nova.virt.driver [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.269 187003 INFO nova.compute.provider_config [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.278 187003 DEBUG oslo_concurrency.lockutils [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.279 187003 DEBUG oslo_concurrency.lockutils [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.279 187003 DEBUG oslo_concurrency.lockutils [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.279 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.279 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.280 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.280 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.280 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.280 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.280 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.281 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.282 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.282 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.282 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.282 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.282 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.283 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.284 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.285 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.286 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.286 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.286 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.287 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.287 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.287 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.287 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.287 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.288 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.289 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.290 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.291 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.292 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.293 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.294 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.300 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.300 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.300 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.301 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.302 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.303 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.304 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.305 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.306 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.307 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.308 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.309 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.310 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.310 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.310 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.310 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.310 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.311 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.312 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.312 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.312 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.312 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.312 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.313 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.313 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.313 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.313 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.313 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.314 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.314 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.314 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.314 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.314 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.315 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.316 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.317 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.318 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.319 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.320 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.321 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.322 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.323 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.324 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.325 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.326 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.327 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.328 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.329 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.330 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.331 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.332 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.333 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.334 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.335 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.336 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.337 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.338 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.339 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.339 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.339 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.339 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.339 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.340 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.341 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.342 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.343 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.343 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.343 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.343 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.343 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.344 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.345 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.345 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.345 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.345 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.345 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.346 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.347 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.348 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.349 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.350 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.351 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.351 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.351 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.351 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.351 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.352 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.352 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.352 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.352 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.352 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.353 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.354 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.355 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.356 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.357 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.358 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.359 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.360 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.360 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.360 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.360 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.360 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.361 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.361 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.361 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.361 187003 WARNING oslo_config.cfg [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 01:52:04 compute-0 nova_compute[186999]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 01:52:04 compute-0 nova_compute[186999]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 01:52:04 compute-0 nova_compute[186999]: and ``live_migration_inbound_addr`` respectively.
Nov 24 01:52:04 compute-0 nova_compute[186999]: ).  Its value may be silently ignored in the future.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
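[Editor's note: the deprecation warning above says `live_migration_uri` should be replaced by `live_migration_scheme` and `live_migration_inbound_addr`. Given the value logged here (`qemu+tls://%s/system`), a sketch of the equivalent `nova.conf` settings might look like the following; the inbound address is a hypothetical placeholder, since no migration address appears in this log (`live_migration_inbound_addr` is logged as `None` above).]

```
[libvirt]
# Replaces the scheme portion of the deprecated
# live_migration_uri = qemu+tls://%s/system
live_migration_scheme = tls
# Hypothetical example value -- set to this host's migration
# network address (or hostname) if one is dedicated to migration.
# live_migration_inbound_addr = <migration-network-address>
```

With `live_migration_uri` removed, Nova composes the target URI from these two options instead, which is why the warning calls them out as the replacement pair.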
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.362 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.363 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.364 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.365 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.366 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.366 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.366 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.366 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.366 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.367 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.368 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.369 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.370 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.371 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.372 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.373 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.374 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.375 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.375 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.375 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.375 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.375 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.376 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.377 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.378 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.379 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.380 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.381 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.382 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.382 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.382 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.382 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.382 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.383 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.384 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.385 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.385 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.385 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.385 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.385 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.386 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.387 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.388 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.389 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.389 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.389 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.389 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.389 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.390 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.391 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.392 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.392 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.392 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.392 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.392 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.393 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.394 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.395 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.396 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.397 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.398 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.399 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.400 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.400 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.400 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.400 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.400 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.401 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.402 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.403 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.404 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.405 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.406 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.407 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.408 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.409 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.410 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.411 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.412 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.413 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.414 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.415 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.416 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.417 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.418 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.419 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.420 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.421 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.422 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.423 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.424 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.425 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.426 187003 DEBUG oslo_service.service [None req-3dfe3e38-d16e-4ce4-b8e8-20d06f2d45ad - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.427 187003 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.438 187003 INFO nova.virt.node [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Determined node identity f28f14d1-2972-450a-b67e-0899e7918234 from /var/lib/nova/compute_id
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.439 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.440 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.440 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.440 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.451 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7efec208d100> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.453 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7efec208d100> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.454 187003 INFO nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Connection event '1' reason 'None'
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.460 187003 INFO nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]: 
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <host>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <uuid>a7051ccc-fa00-488d-9432-c0e2d4ac9648</uuid>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <arch>x86_64</arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model>EPYC-Rome-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <vendor>AMD</vendor>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <microcode version='16777317'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <signature family='23' model='49' stepping='0'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='x2apic'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='tsc-deadline'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='osxsave'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='hypervisor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='tsc_adjust'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='spec-ctrl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='stibp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='arch-capabilities'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='cmp_legacy'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='topoext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='virt-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='lbrv'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='tsc-scale'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='vmcb-clean'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='pause-filter'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='pfthreshold'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='svme-addr-chk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='rdctl-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='mds-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature name='pschange-mc-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <pages unit='KiB' size='4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <pages unit='KiB' size='2048'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <pages unit='KiB' size='1048576'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <power_management>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <suspend_mem/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <suspend_disk/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <suspend_hybrid/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </power_management>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <iommu support='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <migration_features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <live/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <uri_transports>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <uri_transport>tcp</uri_transport>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <uri_transport>rdma</uri_transport>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </uri_transports>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </migration_features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <topology>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <cells num='1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <cell id='0'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <memory unit='KiB'>7864320</memory>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <distances>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <sibling id='0' value='10'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           </distances>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           <cpus num='8'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:           </cpus>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         </cell>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </cells>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </topology>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <cache>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </cache>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <secmodel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model>selinux</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <doi>0</doi>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </secmodel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <secmodel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model>dac</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <doi>0</doi>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </secmodel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </host>
Nov 24 01:52:04 compute-0 nova_compute[186999]: 
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <guest>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <os_type>hvm</os_type>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <arch name='i686'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <wordsize>32</wordsize>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <domain type='qemu'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <domain type='kvm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <pae/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <nonpae/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <acpi default='on' toggle='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <apic default='on' toggle='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <cpuselection/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <deviceboot/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <disksnapshot default='on' toggle='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <externalSnapshot/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </guest>
Nov 24 01:52:04 compute-0 nova_compute[186999]: 
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <guest>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <os_type>hvm</os_type>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <arch name='x86_64'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <wordsize>64</wordsize>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <domain type='qemu'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <domain type='kvm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <acpi default='on' toggle='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <apic default='on' toggle='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <cpuselection/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <deviceboot/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <disksnapshot default='on' toggle='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <externalSnapshot/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </guest>
Nov 24 01:52:04 compute-0 nova_compute[186999]: 
Nov 24 01:52:04 compute-0 nova_compute[186999]: </capabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]: 
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.466 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.468 187003 DEBUG nova.virt.libvirt.volume.mount [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.469 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 01:52:04 compute-0 nova_compute[186999]: <domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <domain>kvm</domain>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <arch>i686</arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <vcpu max='4096'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <iothreads supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <os supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='firmware'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <loader supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>rom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pflash</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='readonly'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>yes</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='secure'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </loader>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </os>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='maximumMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <vendor>AMD</vendor>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='succor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='custom' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-128'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-256'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-512'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <memoryBacking supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='sourceType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>anonymous</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>memfd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </memoryBacking>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <disk supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='diskDevice'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>disk</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cdrom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>floppy</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>lun</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>fdc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>sata</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <graphics supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vnc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egl-headless</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <video supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='modelType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vga</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cirrus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>none</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>bochs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ramfb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </video>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hostdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='mode'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>subsystem</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='startupPolicy'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>mandatory</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>requisite</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>optional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='subsysType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pci</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='capsType'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='pciBackend'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hostdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <rng supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>random</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <filesystem supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='driverType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>path</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>handle</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtiofs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </filesystem>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <tpm supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-tis</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-crb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emulator</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>external</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendVersion'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>2.0</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </tpm>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <redirdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </redirdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <channel supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </channel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <crypto supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </crypto>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <interface supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>passt</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <panic supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>isa</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>hyperv</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </panic>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <console supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>null</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dev</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pipe</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stdio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>udp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tcp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu-vdagent</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </console>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <gic supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <genid supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backup supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <async-teardown supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <ps2 supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sev supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sgx supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hyperv supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='features'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>relaxed</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vapic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>spinlocks</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vpindex</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>runtime</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>synic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stimer</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reset</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vendor_id</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>frequencies</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reenlightenment</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tlbflush</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ipi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>avic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emsr_bitmap</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>xmm_input</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hyperv>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <launchSecurity supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='sectype'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tdx</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </launchSecurity>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]: </domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.476 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 01:52:04 compute-0 nova_compute[186999]: <domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <domain>kvm</domain>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <arch>i686</arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <vcpu max='240'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <iothreads supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <os supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='firmware'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <loader supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>rom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pflash</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='readonly'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>yes</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='secure'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </loader>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </os>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='maximumMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <vendor>AMD</vendor>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='succor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='custom' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-128'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-256'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-512'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <memoryBacking supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='sourceType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>anonymous</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>memfd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </memoryBacking>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <disk supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='diskDevice'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>disk</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cdrom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>floppy</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>lun</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ide</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>fdc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>sata</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <graphics supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vnc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egl-headless</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <video supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='modelType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vga</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cirrus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>none</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>bochs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ramfb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </video>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hostdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='mode'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>subsystem</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='startupPolicy'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>mandatory</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>requisite</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>optional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='subsysType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pci</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='capsType'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='pciBackend'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hostdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <rng supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>random</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <filesystem supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='driverType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>path</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>handle</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtiofs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </filesystem>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <tpm supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-tis</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-crb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emulator</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>external</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendVersion'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>2.0</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </tpm>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <redirdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </redirdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <channel supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </channel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <crypto supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </crypto>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <interface supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>passt</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <panic supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>isa</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>hyperv</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </panic>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <console supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>null</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dev</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pipe</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stdio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>udp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tcp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu-vdagent</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </console>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <gic supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <genid supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backup supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <async-teardown supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <ps2 supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sev supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sgx supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hyperv supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='features'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>relaxed</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vapic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>spinlocks</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vpindex</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>runtime</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>synic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stimer</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reset</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vendor_id</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>frequencies</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reenlightenment</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tlbflush</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ipi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>avic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emsr_bitmap</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>xmm_input</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hyperv>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <launchSecurity supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='sectype'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tdx</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </launchSecurity>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]: </domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.502 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.506 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 01:52:04 compute-0 nova_compute[186999]: <domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <domain>kvm</domain>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <arch>x86_64</arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <vcpu max='4096'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <iothreads supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <os supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='firmware'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>efi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <loader supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>rom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pflash</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='readonly'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>yes</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='secure'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>yes</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </loader>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </os>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='maximumMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <vendor>AMD</vendor>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='succor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='custom' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-128'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-256'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-512'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <memoryBacking supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='sourceType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>anonymous</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>memfd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </memoryBacking>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <disk supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='diskDevice'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>disk</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cdrom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>floppy</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>lun</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>fdc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>sata</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <graphics supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vnc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egl-headless</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <video supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='modelType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vga</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cirrus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>none</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>bochs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ramfb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </video>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hostdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='mode'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>subsystem</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='startupPolicy'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>mandatory</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>requisite</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>optional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='subsysType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pci</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='capsType'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='pciBackend'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hostdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <rng supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>random</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <filesystem supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='driverType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>path</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>handle</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtiofs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </filesystem>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <tpm supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-tis</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-crb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emulator</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>external</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendVersion'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>2.0</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </tpm>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <redirdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </redirdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <channel supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </channel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <crypto supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </crypto>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <interface supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>passt</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <panic supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>isa</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>hyperv</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </panic>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <console supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>null</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dev</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pipe</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stdio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>udp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tcp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu-vdagent</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </console>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <gic supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <genid supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backup supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <async-teardown supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <ps2 supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sev supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sgx supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hyperv supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='features'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>relaxed</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vapic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>spinlocks</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vpindex</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>runtime</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>synic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stimer</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reset</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vendor_id</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>frequencies</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reenlightenment</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tlbflush</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ipi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>avic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emsr_bitmap</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>xmm_input</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hyperv>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <launchSecurity supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='sectype'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tdx</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </launchSecurity>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]: </domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.572 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 01:52:04 compute-0 nova_compute[186999]: <domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <domain>kvm</domain>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <arch>x86_64</arch>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <vcpu max='240'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <iothreads supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <os supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='firmware'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <loader supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>rom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pflash</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='readonly'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>yes</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='secure'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>no</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </loader>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </os>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-passthrough' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='hostPassthroughMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='maximum' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='maximumMigratable'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>on</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>off</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='host-model' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <vendor>AMD</vendor>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='x2apic'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='hypervisor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='stibp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='overflow-recov'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='succor'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lbrv'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='tsc-scale'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='flushbyasid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pause-filter'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='pfthreshold'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <feature policy='disable' name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <mode name='custom' supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Broadwell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Cooperlake-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Denverton-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Dhyana-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='auto-ibrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Milan-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amd-psfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='no-nested-data-bp'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='null-sel-clr-base'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='stibp-always-on'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-Rome-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='EPYC-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='GraniteRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-128'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-256'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx10-512'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='prefetchiti'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Haswell-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v6'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Icelake-Server-v7'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='IvyBridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='KnightsMill-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4fmaps'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-4vnniw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512er'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512pf'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G4-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Opteron_G5-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fma4'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tbm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xop'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SapphireRapids-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='amx-tile'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-bf16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-fp16'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512-vpopcntdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bitalg'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vbmi2'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrc'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fzrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='la57'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='taa-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='tsx-ldtrk'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xfd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='SierraForest-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ifma'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-ne-convert'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx-vnni-int8'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='bus-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cmpccxadd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fbsdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='fsrs'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ibrs-all'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mcdt-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pbrsb-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='psdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='sbdr-ssdp-no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='serialize'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vaes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='vpclmulqdq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Client-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='hle'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='rtm'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Skylake-Server-v5'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512bw'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512cd'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512dq'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512f'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='avx512vl'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='invpcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pcid'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='pku'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='mpx'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v2'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v3'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='core-capability'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='split-lock-detect'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='Snowridge-v4'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='cldemote'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='erms'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='gfni'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdir64b'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='movdiri'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='xsaves'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='athlon-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='core2duo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='coreduo-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='n270-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='ss'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <blockers model='phenom-v1'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnow'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <feature name='3dnowext'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </blockers>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </mode>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <memoryBacking supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <enum name='sourceType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>anonymous</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <value>memfd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </memoryBacking>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <disk supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='diskDevice'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>disk</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cdrom</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>floppy</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>lun</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ide</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>fdc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>sata</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <graphics supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vnc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egl-headless</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <video supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='modelType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vga</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>cirrus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>none</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>bochs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ramfb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </video>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hostdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='mode'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>subsystem</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='startupPolicy'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>mandatory</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>requisite</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>optional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='subsysType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pci</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>scsi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='capsType'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='pciBackend'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hostdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <rng supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtio-non-transitional</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>random</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>egd</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <filesystem supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='driverType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>path</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>handle</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>virtiofs</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </filesystem>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <tpm supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-tis</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tpm-crb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emulator</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>external</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendVersion'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>2.0</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </tpm>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <redirdev supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='bus'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>usb</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </redirdev>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <channel supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </channel>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <crypto supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendModel'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>builtin</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </crypto>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <interface supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='backendType'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>default</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>passt</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <panic supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='model'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>isa</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>hyperv</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </panic>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <console supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='type'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>null</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vc</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pty</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dev</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>file</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>pipe</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stdio</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>udp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tcp</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>unix</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>qemu-vdagent</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>dbus</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </console>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   <features>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <gic supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <vmcoreinfo supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <genid supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backingStoreInput supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <backup supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <async-teardown supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <ps2 supported='yes'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sev supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <sgx supported='no'/>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <hyperv supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='features'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>relaxed</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vapic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>spinlocks</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vpindex</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>runtime</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>synic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>stimer</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reset</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>vendor_id</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>frequencies</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>reenlightenment</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tlbflush</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>ipi</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>avic</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>emsr_bitmap</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>xmm_input</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <spinlocks>4095</spinlocks>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <stimer_direct>on</stimer_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </defaults>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </hyperv>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     <launchSecurity supported='yes'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       <enum name='sectype'>
Nov 24 01:52:04 compute-0 nova_compute[186999]:         <value>tdx</value>
Nov 24 01:52:04 compute-0 nova_compute[186999]:       </enum>
Nov 24 01:52:04 compute-0 nova_compute[186999]:     </launchSecurity>
Nov 24 01:52:04 compute-0 nova_compute[186999]:   </features>
Nov 24 01:52:04 compute-0 nova_compute[186999]: </domainCapabilities>
Nov 24 01:52:04 compute-0 nova_compute[186999]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.645 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.645 187003 INFO nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Secure Boot support detected
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.647 187003 INFO nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.648 187003 INFO nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.656 187003 DEBUG nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.680 187003 INFO nova.virt.node [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Determined node identity f28f14d1-2972-450a-b67e-0899e7918234 from /var/lib/nova/compute_id
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.694 187003 WARNING nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Compute nodes ['f28f14d1-2972-450a-b67e-0899e7918234'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.733 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.749 187003 WARNING nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.749 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.750 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.750 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.750 187003 DEBUG nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.893 187003 WARNING nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.894 187003 DEBUG nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6182MB free_disk=73.66816329956055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.894 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.894 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.909 187003 WARNING nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] No compute node record for compute-0.ctlplane.example.com:f28f14d1-2972-450a-b67e-0899e7918234: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host f28f14d1-2972-450a-b67e-0899e7918234 could not be found.
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.924 187003 INFO nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: f28f14d1-2972-450a-b67e-0899e7918234
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.981 187003 DEBUG nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:52:04 compute-0 nova_compute[186999]: 2025-11-24 01:52:04.981 187003 DEBUG nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.034 187003 INFO nova.scheduler.client.report [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [req-befc792a-211b-4b4a-8816-87e0af0fd4db] Created resource provider record via placement API for resource provider with UUID f28f14d1-2972-450a-b67e-0899e7918234 and name compute-0.ctlplane.example.com.
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.438 187003 DEBUG nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 24 01:52:06 compute-0 nova_compute[186999]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.439 187003 INFO nova.virt.libvirt.host [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] kernel doesn't support AMD SEV
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.440 187003 DEBUG nova.compute.provider_tree [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.440 187003 DEBUG nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.507 187003 DEBUG nova.scheduler.client.report [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Updated inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.507 187003 DEBUG nova.compute.provider_tree [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Updating resource provider f28f14d1-2972-450a-b67e-0899e7918234 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.507 187003 DEBUG nova.compute.provider_tree [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.610 187003 DEBUG nova.compute.provider_tree [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Updating resource provider f28f14d1-2972-450a-b67e-0899e7918234 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.632 187003 DEBUG nova.compute.resource_tracker [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.633 187003 DEBUG oslo_concurrency.lockutils [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.633 187003 DEBUG nova.service [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.692 187003 DEBUG nova.service [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 24 01:52:06 compute-0 nova_compute[186999]: 2025-11-24 01:52:06.693 187003 DEBUG nova.servicegroup.drivers.db [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 24 01:52:07 compute-0 nova_compute[186999]: 2025-11-24 01:52:07.695 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:52:07 compute-0 nova_compute[186999]: 2025-11-24 01:52:07.708 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:52:08 compute-0 sshd-session[187299]: Accepted publickey for zuul from 192.168.122.30 port 43062 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 01:52:08 compute-0 systemd-logind[791]: New session 25 of user zuul.
Nov 24 01:52:08 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 24 01:52:08 compute-0 sshd-session[187299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 01:52:09 compute-0 python3.9[187452]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 01:52:10 compute-0 sudo[187606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfakhezoukmzjcsxlqhrkywrcuybrhyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949130.2373412-36-84910592457034/AnsiballZ_systemd_service.py'
Nov 24 01:52:10 compute-0 sudo[187606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:11 compute-0 python3.9[187608]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:52:11 compute-0 systemd[1]: Reloading.
Nov 24 01:52:11 compute-0 systemd-rc-local-generator[187630]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:52:11 compute-0 systemd-sysv-generator[187635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:52:11 compute-0 sudo[187606]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:12 compute-0 python3.9[187792]: ansible-ansible.builtin.service_facts Invoked
Nov 24 01:52:12 compute-0 network[187809]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 01:52:12 compute-0 network[187810]: 'network-scripts' will be removed from distribution in near future.
Nov 24 01:52:12 compute-0 network[187811]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 01:52:17 compute-0 sudo[188084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhvswrtsydxbxwqqnkmzlbtvokecmiyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949137.3192956-55-174907872036713/AnsiballZ_systemd_service.py'
Nov 24 01:52:17 compute-0 sudo[188084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:17 compute-0 python3.9[188086]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:52:17 compute-0 sudo[188084]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:18 compute-0 sudo[188237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwfkkwttkndtumfzetsojjenbkgmvjqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949138.2074447-65-56350322293102/AnsiballZ_file.py'
Nov 24 01:52:18 compute-0 sudo[188237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:18 compute-0 python3.9[188239]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:18 compute-0 sudo[188237]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:18 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:52:18 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 01:52:19 compute-0 sudo[188390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhjjymgaotayvejykvzlrbtkwknmsknr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949138.970208-73-66846594239304/AnsiballZ_file.py'
Nov 24 01:52:19 compute-0 sudo[188390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:19 compute-0 python3.9[188392]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:19 compute-0 sudo[188390]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:21 compute-0 sudo[188542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlatbbtmkgljdpxnrcsnoccotnjvzigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949139.7347555-82-223949131992825/AnsiballZ_command.py'
Nov 24 01:52:21 compute-0 sudo[188542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:21 compute-0 python3.9[188544]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:52:21 compute-0 sudo[188542]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:22 compute-0 python3.9[188696]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:52:22 compute-0 sudo[188846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gssdcukvbcygsjveblqkrcelwdgkdque ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949142.5880103-100-70141594897615/AnsiballZ_systemd_service.py'
Nov 24 01:52:22 compute-0 sudo[188846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:23 compute-0 python3.9[188848]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:52:23 compute-0 systemd[1]: Reloading.
Nov 24 01:52:23 compute-0 systemd-rc-local-generator[188877]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:52:23 compute-0 systemd-sysv-generator[188881]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:52:23 compute-0 sudo[188846]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:23 compute-0 podman[188908]: 2025-11-24 01:52:23.862381152 +0000 UTC m=+0.096084070 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 01:52:24 compute-0 sudo[189050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akokfgjdcpyazoocqyabekozssvhldve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949143.8105402-108-99432832345973/AnsiballZ_command.py'
Nov 24 01:52:24 compute-0 sudo[189050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:24 compute-0 python3.9[189052]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:52:24 compute-0 sudo[189050]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:24 compute-0 sudo[189203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nazfoxrqxhmoktfjjjsbmagsitmkhedv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949144.5524592-117-34264800078018/AnsiballZ_file.py'
Nov 24 01:52:24 compute-0 sudo[189203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:25 compute-0 python3.9[189205]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:52:25 compute-0 sudo[189203]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:25 compute-0 python3.9[189355]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:52:26 compute-0 sshd[128585]: Timeout before authentication for connection from 124.220.16.150 to 38.102.83.32, pid = 169841
Nov 24 01:52:26 compute-0 python3.9[189507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:27 compute-0 python3.9[189628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949146.109506-133-77404768341854/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:52:28 compute-0 sudo[189778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmiftylejztpdnwwskuovzcobstrujrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949147.630758-148-256800098674593/AnsiballZ_group.py'
Nov 24 01:52:28 compute-0 sudo[189778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:28 compute-0 python3.9[189780]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 24 01:52:28 compute-0 sudo[189778]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:29 compute-0 sudo[189938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msnkxsyqiaqekobswgpxkaxtjswvqcbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949148.647784-159-262093562542585/AnsiballZ_getent.py'
Nov 24 01:52:29 compute-0 sudo[189938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:29 compute-0 podman[189904]: 2025-11-24 01:52:29.12373223 +0000 UTC m=+0.074375261 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 01:52:29 compute-0 python3.9[189948]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 24 01:52:29 compute-0 sudo[189938]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:29 compute-0 sudo[190102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpxipvpccpwpxcpdrhtjauivkuielqif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949149.5553656-167-236989797692312/AnsiballZ_group.py'
Nov 24 01:52:29 compute-0 sudo[190102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:30 compute-0 python3.9[190104]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 01:52:30 compute-0 groupadd[190105]: group added to /etc/group: name=ceilometer, GID=42405
Nov 24 01:52:30 compute-0 groupadd[190105]: group added to /etc/gshadow: name=ceilometer
Nov 24 01:52:30 compute-0 groupadd[190105]: new group: name=ceilometer, GID=42405
Nov 24 01:52:30 compute-0 sudo[190102]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:30 compute-0 podman[190111]: 2025-11-24 01:52:30.293552163 +0000 UTC m=+0.107353251 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 01:52:30 compute-0 sudo[190286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coynfmcfnhpepqqhbelzsdkaegdhddek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949150.4292762-175-29532683332207/AnsiballZ_user.py'
Nov 24 01:52:30 compute-0 sudo[190286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:31 compute-0 python3.9[190288]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 01:52:31 compute-0 useradd[190290]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 01:52:31 compute-0 useradd[190290]: add 'ceilometer' to group 'libvirt'
Nov 24 01:52:31 compute-0 useradd[190290]: add 'ceilometer' to shadow group 'libvirt'
Nov 24 01:52:31 compute-0 sudo[190286]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:32 compute-0 python3.9[190446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:32 compute-0 python3.9[190567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763949151.9141066-201-198392477850007/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:33 compute-0 python3.9[190717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:34 compute-0 python3.9[190838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763949153.138136-201-213661218781011/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:34 compute-0 python3.9[190988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:35 compute-0 python3.9[191109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763949154.3651793-201-117467829892494/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:36 compute-0 python3.9[191259]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:52:36 compute-0 python3.9[191411]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:52:37 compute-0 python3.9[191563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:38 compute-0 python3.9[191684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949157.1248262-260-261164427304475/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:38 compute-0 python3.9[191834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:39 compute-0 python3.9[191910]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:39 compute-0 python3.9[192060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:40 compute-0 python3.9[192181]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949159.494146-260-266643565875223/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:41 compute-0 python3.9[192331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:41 compute-0 python3.9[192452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949160.6637478-260-71050354710055/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:42 compute-0 python3.9[192602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:42 compute-0 python3.9[192723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949161.8845584-260-178762670028922/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:43 compute-0 python3.9[192873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:44 compute-0 python3.9[192994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949163.0804653-260-38490476044024/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:44 compute-0 python3.9[193144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:45 compute-0 python3.9[193265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949164.2028399-260-85608527187601/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:45 compute-0 python3.9[193415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:46 compute-0 python3.9[193536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949165.4889367-260-45044355861019/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:47 compute-0 python3.9[193686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:47 compute-0 python3.9[193807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949166.65872-260-104399427881727/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:52:48.411 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:52:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:52:48.411 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:52:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:52:48.411 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:52:48 compute-0 python3.9[193957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:48 compute-0 python3.9[194078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949167.948496-260-9205758632691/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:49 compute-0 python3.9[194228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:50 compute-0 python3.9[194349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949169.1323922-260-217810016600236/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:51 compute-0 python3.9[194499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:51 compute-0 python3.9[194575]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:52 compute-0 python3.9[194725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:52 compute-0 python3.9[194801]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:53 compute-0 python3.9[194951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:53 compute-0 python3.9[195027]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:54 compute-0 podman[195028]: 2025-11-24 01:52:54.022050932 +0000 UTC m=+0.067478474 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:52:54 compute-0 sudo[195198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldnilonwvvyxlgajzeqshbdbcpppsbtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949174.152268-449-84054080149111/AnsiballZ_file.py'
Nov 24 01:52:54 compute-0 sudo[195198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:54 compute-0 python3.9[195200]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:54 compute-0 sudo[195198]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:55 compute-0 sudo[195350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gybtkqlfntocohcciijahgftmxtskpic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949174.878982-457-107359465147787/AnsiballZ_file.py'
Nov 24 01:52:55 compute-0 sudo[195350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:55 compute-0 python3.9[195352]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:52:55 compute-0 sudo[195350]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:56 compute-0 sudo[195502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urttqzguqpkscnwvtogrqujpnlcbqqxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949175.6815145-465-263403215007181/AnsiballZ_file.py'
Nov 24 01:52:56 compute-0 sudo[195502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:56 compute-0 python3.9[195504]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:52:56 compute-0 sudo[195502]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:56 compute-0 sudo[195654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esamwpbtcmzfxuisqerywgfxtyhgjhly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949176.49422-473-65487940478143/AnsiballZ_systemd_service.py'
Nov 24 01:52:56 compute-0 sudo[195654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:57 compute-0 python3.9[195656]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:52:57 compute-0 systemd[1]: Reloading.
Nov 24 01:52:57 compute-0 systemd-sysv-generator[195688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:52:57 compute-0 systemd-rc-local-generator[195684]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:52:57 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 24 01:52:57 compute-0 sudo[195654]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:58 compute-0 sudo[195845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrswjrjvejgpenwwexqgronxupxsflpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/AnsiballZ_stat.py'
Nov 24 01:52:58 compute-0 sudo[195845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:58 compute-0 python3.9[195847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:58 compute-0 sudo[195845]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:58 compute-0 sudo[195968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbqwtwebrnseyixtvfgmdfophtknjdyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/AnsiballZ_copy.py'
Nov 24 01:52:58 compute-0 sudo[195968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:59 compute-0 python3.9[195970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:52:59 compute-0 sudo[195968]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:59 compute-0 sudo[196057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibapawbtbvsdhnucjdewstslfpelwdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/AnsiballZ_stat.py'
Nov 24 01:52:59 compute-0 sudo[196057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:52:59 compute-0 podman[196018]: 2025-11-24 01:52:59.321747504 +0000 UTC m=+0.078389975 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 01:52:59 compute-0 python3.9[196066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:52:59 compute-0 sudo[196057]: pam_unix(sudo:session): session closed for user root
Nov 24 01:52:59 compute-0 sudo[196187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxlbxyleozjikptgfpvqbmrnectvvlix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/AnsiballZ_copy.py'
Nov 24 01:52:59 compute-0 sudo[196187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:00 compute-0 python3.9[196189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949177.917881-482-38074607534352/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:53:00 compute-0 sudo[196187]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:00 compute-0 podman[196268]: 2025-11-24 01:53:00.902521679 +0000 UTC m=+0.155071800 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 24 01:53:01 compute-0 sshd-session[196214]: Invalid user admin from 46.188.119.26 port 34798
Nov 24 01:53:01 compute-0 sudo[196367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmdsntkcohxtsevzyjjmeezqylonlfoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949180.481918-510-109871064393904/AnsiballZ_container_config_data.py'
Nov 24 01:53:01 compute-0 sudo[196367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:01 compute-0 sshd-session[196214]: Received disconnect from 46.188.119.26 port 34798:11: Bye Bye [preauth]
Nov 24 01:53:01 compute-0 sshd-session[196214]: Disconnected from invalid user admin 46.188.119.26 port 34798 [preauth]
Nov 24 01:53:01 compute-0 python3.9[196369]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 24 01:53:01 compute-0 sudo[196367]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:01 compute-0 sudo[196519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqmrtttgaqxnucqsowgikuistlynjxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949181.4855564-519-256229241644119/AnsiballZ_container_config_hash.py'
Nov 24 01:53:01 compute-0 sudo[196519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:02 compute-0 python3.9[196521]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:53:02 compute-0 sudo[196519]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:02 compute-0 sudo[196671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfsxfvnngokunqtvogrmkaysdnzklosy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949182.4183073-529-246575603662873/AnsiballZ_edpm_container_manage.py'
Nov 24 01:53:02 compute-0 sudo[196671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:03 compute-0 python3[196673]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:53:03 compute-0 podman[196711]: 2025-11-24 01:53:03.439506619 +0000 UTC m=+0.051944641 container create 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true)
Nov 24 01:53:03 compute-0 podman[196711]: 2025-11-24 01:53:03.412827199 +0000 UTC m=+0.025265211 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 24 01:53:03 compute-0 python3[196673]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 24 01:53:03 compute-0 sudo[196671]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.774 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.774 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.786 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.786 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.786 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.787 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.787 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.787 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.787 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.788 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.811 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.812 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.812 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.812 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.984 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.987 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6172MB free_disk=73.6633071899414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.987 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:53:03 compute-0 nova_compute[186999]: 2025-11-24 01:53:03.987 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.087 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.088 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.115 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.125 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.127 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:53:04 compute-0 nova_compute[186999]: 2025-11-24 01:53:04.127 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:53:04 compute-0 sudo[196899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gajzdxxlfkyjwzfdqgdgspqhcebrxuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949183.8287773-537-247141721677504/AnsiballZ_stat.py'
Nov 24 01:53:04 compute-0 sudo[196899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:04 compute-0 python3.9[196901]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:53:04 compute-0 sudo[196899]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:04 compute-0 sudo[197053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caxtgzbiqkmzbzforurajovxcziosewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949184.6563907-546-117189067860438/AnsiballZ_file.py'
Nov 24 01:53:04 compute-0 sudo[197053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:05 compute-0 python3.9[197055]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:05 compute-0 sudo[197053]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:05 compute-0 sudo[197204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsiueffpwfqhjfdoybaogldaxfjorvww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949185.2713118-546-58225346311500/AnsiballZ_copy.py'
Nov 24 01:53:05 compute-0 sudo[197204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:05 compute-0 python3.9[197206]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949185.2713118-546-58225346311500/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:05 compute-0 sudo[197204]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:06 compute-0 sudo[197280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqkfhtibgkjlvgteobszrxywusjgcsba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949185.2713118-546-58225346311500/AnsiballZ_systemd.py'
Nov 24 01:53:06 compute-0 sudo[197280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:06 compute-0 python3.9[197282]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:53:06 compute-0 systemd[1]: Reloading.
Nov 24 01:53:07 compute-0 systemd-rc-local-generator[197308]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:07 compute-0 systemd-sysv-generator[197313]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:07 compute-0 sudo[197280]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:07 compute-0 sudo[197392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpyttzdrlrujfoensjcpxqmxscfrqwsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949185.2713118-546-58225346311500/AnsiballZ_systemd.py'
Nov 24 01:53:07 compute-0 sudo[197392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:07 compute-0 python3.9[197394]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:53:07 compute-0 systemd[1]: Reloading.
Nov 24 01:53:08 compute-0 systemd-sysv-generator[197421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:08 compute-0 systemd-rc-local-generator[197418]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:08 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 24 01:53:08 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.
Nov 24 01:53:08 compute-0 podman[197434]: 2025-11-24 01:53:08.394617709 +0000 UTC m=+0.140190787 container init 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + sudo -E kolla_set_configs
Nov 24 01:53:08 compute-0 sudo[197455]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: sudo: unable to send audit message: Operation not permitted
Nov 24 01:53:08 compute-0 sudo[197455]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:53:08 compute-0 sudo[197455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 01:53:08 compute-0 podman[197434]: 2025-11-24 01:53:08.432239642 +0000 UTC m=+0.177812700 container start 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 01:53:08 compute-0 podman[197434]: ceilometer_agent_compute
Nov 24 01:53:08 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Validating config file
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Copying service configuration files
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: INFO:__main__:Writing out command to execute
Nov 24 01:53:08 compute-0 sudo[197455]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: ++ cat /run_command
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + ARGS=
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + sudo kolla_copy_cacerts
Nov 24 01:53:08 compute-0 sudo[197392]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:08 compute-0 sudo[197469]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: sudo: unable to send audit message: Operation not permitted
Nov 24 01:53:08 compute-0 sudo[197469]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:53:08 compute-0 sudo[197469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 01:53:08 compute-0 sudo[197469]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + [[ ! -n '' ]]
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + . kolla_extend_start
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + umask 0022
Nov 24 01:53:08 compute-0 ceilometer_agent_compute[197448]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 24 01:53:08 compute-0 podman[197454]: 2025-11-24 01:53:08.54163299 +0000 UTC m=+0.090174142 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 01:53:08 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-311e70911a6d0cf2.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:53:08 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-311e70911a6d0cf2.service: Failed with result 'exit-code'.
Nov 24 01:53:09 compute-0 sudo[197627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drpspvvscbujkyqmjjswadivazuirzjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949188.666731-570-94671266191743/AnsiballZ_systemd.py'
Nov 24 01:53:09 compute-0 sudo[197627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.373 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.374 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.375 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.376 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 python3.9[197629]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.377 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.378 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.379 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.382 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.383 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.384 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.385 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.386 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.387 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.388 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.389 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.409 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.411 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.411 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 24 01:53:09 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.505 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.512 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.595 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.596 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.597 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.598 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.599 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.600 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.601 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.602 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.603 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.604 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.605 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.606 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.607 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.608 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.609 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.610 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.611 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.612 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.613 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.614 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.615 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.616 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.617 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.617 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.617 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 24 01:53:09 compute-0 ceilometer_agent_compute[197448]: 2025-11-24 01:53:09.627 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 24 01:53:09 compute-0 virtqemud[186602]: End of file while reading data: Input/output error
Nov 24 01:53:09 compute-0 systemd[1]: libpod-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope: Deactivated successfully.
Nov 24 01:53:09 compute-0 systemd[1]: libpod-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope: Consumed 1.361s CPU time.
Nov 24 01:53:09 compute-0 podman[197636]: 2025-11-24 01:53:09.773406728 +0000 UTC m=+0.319241180 container died 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:53:09 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-311e70911a6d0cf2.timer: Deactivated successfully.
Nov 24 01:53:09 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.
Nov 24 01:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-userdata-shm.mount: Deactivated successfully.
Nov 24 01:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d-merged.mount: Deactivated successfully.
Nov 24 01:53:09 compute-0 podman[197636]: 2025-11-24 01:53:09.830516396 +0000 UTC m=+0.376350798 container cleanup 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 24 01:53:09 compute-0 podman[197636]: ceilometer_agent_compute
Nov 24 01:53:09 compute-0 podman[197666]: ceilometer_agent_compute
Nov 24 01:53:09 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 24 01:53:09 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 24 01:53:09 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 24 01:53:10 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72246bb88eabbfa4f4f7bf4555f287f59cb867906be79aecc01414e78821c35d/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:10 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.
Nov 24 01:53:10 compute-0 podman[197679]: 2025-11-24 01:53:10.049176058 +0000 UTC m=+0.124800868 container init 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + sudo -E kolla_set_configs
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: sudo: unable to send audit message: Operation not permitted
Nov 24 01:53:10 compute-0 sudo[197701]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 01:53:10 compute-0 sudo[197701]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:53:10 compute-0 sudo[197701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 01:53:10 compute-0 podman[197679]: 2025-11-24 01:53:10.076179038 +0000 UTC m=+0.151803828 container start 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 01:53:10 compute-0 podman[197679]: ceilometer_agent_compute
Nov 24 01:53:10 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 24 01:53:10 compute-0 sudo[197627]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Validating config file
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Copying service configuration files
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: INFO:__main__:Writing out command to execute
Nov 24 01:53:10 compute-0 sudo[197701]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: ++ cat /run_command
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + ARGS=
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + sudo kolla_copy_cacerts
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: sudo: unable to send audit message: Operation not permitted
Nov 24 01:53:10 compute-0 sudo[197725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 01:53:10 compute-0 sudo[197725]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 01:53:10 compute-0 sudo[197725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 01:53:10 compute-0 sudo[197725]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + [[ ! -n '' ]]
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + . kolla_extend_start
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + umask 0022
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 24 01:53:10 compute-0 podman[197702]: 2025-11-24 01:53:10.164646439 +0000 UTC m=+0.071088097 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 01:53:10 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-46169b86c3d63c14.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:53:10 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-46169b86c3d63c14.service: Failed with result 'exit-code'.
Nov 24 01:53:10 compute-0 sudo[197876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imyikbjexyguusxmfxcarunxkdbttyoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949190.2936497-578-254095224005219/AnsiballZ_stat.py'
Nov 24 01:53:10 compute-0 sudo[197876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:10 compute-0 python3.9[197878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:53:10 compute-0 sudo[197876]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.996 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.997 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.998 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:10 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:10.999 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.000 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.001 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.005 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.006 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.011 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.011 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.011 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.011 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.011 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.032 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.035 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.037 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.063 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 24 01:53:11 compute-0 sudo[198002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgruybcevvzfoaykfanorwbcfcmaffbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949190.2936497-578-254095224005219/AnsiballZ_copy.py'
Nov 24 01:53:11 compute-0 sudo[198002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.203 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.203 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.203 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.203 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.203 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.204 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.205 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.206 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.207 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.208 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.209 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.210 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.211 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.212 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.213 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.214 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.215 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.216 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.217 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.218 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.219 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.230 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.231 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.236 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.246 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:53:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:53:11 compute-0 python3.9[198004]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949190.2936497-578-254095224005219/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:53:11 compute-0 sudo[198002]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:12 compute-0 sudo[198157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oceqnlibephcgykwafplexnfcwxdmvml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949191.7131343-595-12696734411616/AnsiballZ_container_config_data.py'
Nov 24 01:53:12 compute-0 sudo[198157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:12 compute-0 python3.9[198159]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 24 01:53:12 compute-0 sudo[198157]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:12 compute-0 sudo[198309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xegchrpphioljzobkjpkgsszwvhqnxtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949192.5417583-604-271957590429661/AnsiballZ_container_config_hash.py'
Nov 24 01:53:12 compute-0 sudo[198309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:13 compute-0 python3.9[198311]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:53:13 compute-0 sudo[198309]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:13 compute-0 sudo[198461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anrshglhkxjcrnhysvdyigpsqbedjllw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949193.4157925-614-228732689284115/AnsiballZ_edpm_container_manage.py'
Nov 24 01:53:13 compute-0 sudo[198461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:14 compute-0 python3[198463]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:53:14 compute-0 podman[198501]: 2025-11-24 01:53:14.323725321 +0000 UTC m=+0.056065709 container create b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Nov 24 01:53:14 compute-0 podman[198501]: 2025-11-24 01:53:14.288545129 +0000 UTC m=+0.020885527 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 24 01:53:14 compute-0 python3[198463]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 24 01:53:14 compute-0 sudo[198461]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:14 compute-0 sudo[198690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjlavmgxlovfhatqqchmjmovzyxqqomk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949194.675551-622-171173049557954/AnsiballZ_stat.py'
Nov 24 01:53:14 compute-0 sudo[198690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:15 compute-0 python3.9[198692]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:53:15 compute-0 sudo[198690]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:15 compute-0 sudo[198844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljrpcbqpsbiufcpwsmtypjjrolpgemxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949195.4491518-631-233792120140866/AnsiballZ_file.py'
Nov 24 01:53:15 compute-0 sudo[198844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:15 compute-0 python3.9[198846]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:15 compute-0 sudo[198844]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:16 compute-0 sudo[198995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgyiyepkurjpfndjchneltqelwtryyfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949196.0570617-631-75862332286917/AnsiballZ_copy.py'
Nov 24 01:53:16 compute-0 sudo[198995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:16 compute-0 python3.9[198997]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949196.0570617-631-75862332286917/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:16 compute-0 sudo[198995]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:17 compute-0 sudo[199071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peconkgqubbbnbxfbugjofcblcanprff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949196.0570617-631-75862332286917/AnsiballZ_systemd.py'
Nov 24 01:53:17 compute-0 sudo[199071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:17 compute-0 python3.9[199073]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:53:17 compute-0 systemd[1]: Reloading.
Nov 24 01:53:17 compute-0 systemd-sysv-generator[199102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:17 compute-0 systemd-rc-local-generator[199098]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:17 compute-0 sudo[199071]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:17 compute-0 sudo[199181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsduqdglbacyyqbfiplgpgfevenxwvam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949196.0570617-631-75862332286917/AnsiballZ_systemd.py'
Nov 24 01:53:17 compute-0 sudo[199181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:18 compute-0 python3.9[199183]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:53:18 compute-0 systemd[1]: Reloading.
Nov 24 01:53:18 compute-0 systemd-rc-local-generator[199212]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:18 compute-0 systemd-sysv-generator[199216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:18 compute-0 systemd[1]: Starting node_exporter container...
Nov 24 01:53:18 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a29fe5a410a1d00cca4146e760bc0b2a7d6e02515188cc2bd776134b8a7b33/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a29fe5a410a1d00cca4146e760bc0b2a7d6e02515188cc2bd776134b8a7b33/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.
Nov 24 01:53:18 compute-0 podman[199223]: 2025-11-24 01:53:18.804160952 +0000 UTC m=+0.118111168 container init b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.818Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.818Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.818Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=arp
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=bcache
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=bonding
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=cpu
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=edac
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=filefd
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=netclass
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=netdev
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=netstat
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=nfs
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=nvme
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.819Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=softnet
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=systemd
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=xfs
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=node_exporter.go:117 level=info collector=zfs
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.820Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 24 01:53:18 compute-0 node_exporter[199239]: ts=2025-11-24T01:53:18.821Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 24 01:53:18 compute-0 podman[199223]: 2025-11-24 01:53:18.831500281 +0000 UTC m=+0.145450487 container start b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:53:18 compute-0 podman[199223]: node_exporter
Nov 24 01:53:18 compute-0 systemd[1]: Started node_exporter container.
Nov 24 01:53:18 compute-0 sudo[199181]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:18 compute-0 podman[199248]: 2025-11-24 01:53:18.89884108 +0000 UTC m=+0.054667269 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 01:53:19 compute-0 sudo[199422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukvyvccsjuwqcdvqykkiasaouizuxyjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949199.0686824-655-35079853430346/AnsiballZ_systemd.py'
Nov 24 01:53:19 compute-0 sudo[199422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:19 compute-0 python3.9[199424]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:53:19 compute-0 systemd[1]: Stopping node_exporter container...
Nov 24 01:53:19 compute-0 systemd[1]: libpod-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.scope: Deactivated successfully.
Nov 24 01:53:19 compute-0 podman[199428]: 2025-11-24 01:53:19.908130658 +0000 UTC m=+0.055578986 container died b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 01:53:19 compute-0 systemd[1]: b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213-7659ab3d2ff42a4a.timer: Deactivated successfully.
Nov 24 01:53:19 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.
Nov 24 01:53:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213-userdata-shm.mount: Deactivated successfully.
Nov 24 01:53:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-b1a29fe5a410a1d00cca4146e760bc0b2a7d6e02515188cc2bd776134b8a7b33-merged.mount: Deactivated successfully.
Nov 24 01:53:19 compute-0 podman[199428]: 2025-11-24 01:53:19.946367757 +0000 UTC m=+0.093816125 container cleanup b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:53:19 compute-0 podman[199428]: node_exporter
Nov 24 01:53:19 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 01:53:20 compute-0 podman[199455]: node_exporter
Nov 24 01:53:20 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 24 01:53:20 compute-0 systemd[1]: Stopped node_exporter container.
Nov 24 01:53:20 compute-0 systemd[1]: Starting node_exporter container...
Nov 24 01:53:20 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a29fe5a410a1d00cca4146e760bc0b2a7d6e02515188cc2bd776134b8a7b33/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a29fe5a410a1d00cca4146e760bc0b2a7d6e02515188cc2bd776134b8a7b33/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:20 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.
Nov 24 01:53:20 compute-0 podman[199468]: 2025-11-24 01:53:20.170691551 +0000 UTC m=+0.127027041 container init b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.183Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.183Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.183Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.183Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=arp
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=bcache
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=bonding
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=cpu
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=edac
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=filefd
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=netclass
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=netdev
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=netstat
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.184Z caller=node_exporter.go:117 level=info collector=nfs
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=nvme
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=softnet
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=systemd
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=xfs
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=node_exporter.go:117 level=info collector=zfs
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 24 01:53:20 compute-0 node_exporter[199484]: ts=2025-11-24T01:53:20.185Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 24 01:53:20 compute-0 podman[199468]: 2025-11-24 01:53:20.211207136 +0000 UTC m=+0.167542616 container start b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 01:53:20 compute-0 podman[199468]: node_exporter
Nov 24 01:53:20 compute-0 systemd[1]: Started node_exporter container.
Nov 24 01:53:20 compute-0 sudo[199422]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:20 compute-0 podman[199494]: 2025-11-24 01:53:20.277364112 +0000 UTC m=+0.053670051 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 01:53:20 compute-0 sudo[199667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojsjhtmgazrrwowrirvvswbwbtefyyir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949200.442569-663-40349345926938/AnsiballZ_stat.py'
Nov 24 01:53:20 compute-0 sudo[199667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:20 compute-0 python3.9[199669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:53:20 compute-0 sudo[199667]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:21 compute-0 sudo[199790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prdakercqyfihtehqwbuswvwbsnwkjvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949200.442569-663-40349345926938/AnsiballZ_copy.py'
Nov 24 01:53:21 compute-0 sudo[199790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:21 compute-0 python3.9[199792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949200.442569-663-40349345926938/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:53:21 compute-0 sudo[199790]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:22 compute-0 sudo[199942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohctwnsdklyuohqicmuuxypolkkaftyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949202.205742-680-120428562409201/AnsiballZ_container_config_data.py'
Nov 24 01:53:22 compute-0 sudo[199942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:22 compute-0 python3.9[199944]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 24 01:53:22 compute-0 sudo[199942]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:23 compute-0 sudo[200096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiqahpxsnwwxmvfownghvsblugcxfhnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949202.9664586-689-171790715803268/AnsiballZ_container_config_hash.py'
Nov 24 01:53:23 compute-0 sudo[200096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:23 compute-0 python3.9[200098]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:53:23 compute-0 sudo[200096]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:24 compute-0 sudo[200248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efgwbyvuzfkogqyvwtgefmtwutwprnan ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949203.7678103-699-147794181004978/AnsiballZ_edpm_container_manage.py'
Nov 24 01:53:24 compute-0 sudo[200248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:24 compute-0 podman[200250]: 2025-11-24 01:53:24.131647999 +0000 UTC m=+0.066005581 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:53:24 compute-0 python3[200251]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:53:24 compute-0 sshd-session[200021]: Invalid user gits from 154.90.59.75 port 56826
Nov 24 01:53:24 compute-0 sshd-session[200021]: Received disconnect from 154.90.59.75 port 56826:11: Bye Bye [preauth]
Nov 24 01:53:24 compute-0 sshd-session[200021]: Disconnected from invalid user gits 154.90.59.75 port 56826 [preauth]
Nov 24 01:53:25 compute-0 podman[200282]: 2025-11-24 01:53:25.711613735 +0000 UTC m=+1.337626765 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 24 01:53:25 compute-0 podman[200378]: 2025-11-24 01:53:25.890785198 +0000 UTC m=+0.050667044 container create 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 24 01:53:25 compute-0 podman[200378]: 2025-11-24 01:53:25.866155506 +0000 UTC m=+0.026037382 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 24 01:53:25 compute-0 python3[200251]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 24 01:53:26 compute-0 sudo[200248]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:26 compute-0 sudo[200566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvvcvmludawhyewbvtyvoymueulhexat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949206.2675812-707-234885610968579/AnsiballZ_stat.py'
Nov 24 01:53:26 compute-0 sudo[200566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:26 compute-0 python3.9[200568]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:53:26 compute-0 sudo[200566]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:27 compute-0 auditd[703]: Audit daemon rotating log files
Nov 24 01:53:27 compute-0 sudo[200720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elutipqmjxskdikciwdasnpotccsautw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949207.049045-716-221375880574614/AnsiballZ_file.py'
Nov 24 01:53:27 compute-0 sudo[200720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:27 compute-0 python3.9[200722]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:27 compute-0 sudo[200720]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:28 compute-0 sudo[200871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dypoqfumhrdzwzbztlnxtbxulibdjydb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949207.6275249-716-42467860788133/AnsiballZ_copy.py'
Nov 24 01:53:28 compute-0 sudo[200871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:28 compute-0 python3.9[200873]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949207.6275249-716-42467860788133/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:28 compute-0 sudo[200871]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:28 compute-0 sudo[200947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcdpmcgylzdooyunvkimtonuvumnxpmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949207.6275249-716-42467860788133/AnsiballZ_systemd.py'
Nov 24 01:53:28 compute-0 sudo[200947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:28 compute-0 python3.9[200949]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:53:28 compute-0 systemd[1]: Reloading.
Nov 24 01:53:29 compute-0 systemd-rc-local-generator[200977]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:29 compute-0 systemd-sysv-generator[200981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:29 compute-0 sudo[200947]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:29 compute-0 sudo[201068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnqhywioqdpbisfcokwkuczelldbikgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949207.6275249-716-42467860788133/AnsiballZ_systemd.py'
Nov 24 01:53:29 compute-0 sudo[201068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:29 compute-0 podman[201032]: 2025-11-24 01:53:29.584646666 +0000 UTC m=+0.057880679 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 24 01:53:29 compute-0 python3.9[201077]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:53:29 compute-0 systemd[1]: Reloading.
Nov 24 01:53:29 compute-0 systemd-rc-local-generator[201106]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:30 compute-0 systemd-sysv-generator[201110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:30 compute-0 systemd[1]: Starting podman_exporter container...
Nov 24 01:53:30 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1368bb2bbcb87711706b3e2b1b5185487cdef2ebbbe3a65cd8acf06dca93dcf7/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1368bb2bbcb87711706b3e2b1b5185487cdef2ebbbe3a65cd8acf06dca93dcf7/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.
Nov 24 01:53:30 compute-0 podman[201120]: 2025-11-24 01:53:30.398205406 +0000 UTC m=+0.170040044 container init 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.422Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.422Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.422Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.422Z caller=handler.go:105 level=info collector=container
Nov 24 01:53:30 compute-0 podman[201120]: 2025-11-24 01:53:30.436206348 +0000 UTC m=+0.208040946 container start 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:53:30 compute-0 podman[201120]: podman_exporter
Nov 24 01:53:30 compute-0 systemd[1]: Starting Podman API Service...
Nov 24 01:53:30 compute-0 systemd[1]: Started Podman API Service.
Nov 24 01:53:30 compute-0 systemd[1]: Started podman_exporter container.
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="Setting parallel job count to 25"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="Using sqlite as database backend"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 24 01:53:30 compute-0 podman[201147]: @ - - [24/Nov/2025:01:53:30 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 01:53:30 compute-0 podman[201147]: time="2025-11-24T01:53:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 01:53:30 compute-0 sudo[201068]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:30 compute-0 podman[201147]: @ - - [24/Nov/2025:01:53:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.523Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.524Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 01:53:30 compute-0 podman_exporter[201135]: ts=2025-11-24T01:53:30.525Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 01:53:30 compute-0 podman[201144]: 2025-11-24 01:53:30.5331836 +0000 UTC m=+0.076478919 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:53:30 compute-0 systemd[1]: 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b-44aca2da347a1bfc.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:53:30 compute-0 systemd[1]: 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b-44aca2da347a1bfc.service: Failed with result 'exit-code'.
Nov 24 01:53:31 compute-0 sudo[201343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgvavdumwgheaoobjhquvuknvasfzlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949210.7019386-740-51604689824683/AnsiballZ_systemd.py'
Nov 24 01:53:31 compute-0 sudo[201343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:31 compute-0 podman[201304]: 2025-11-24 01:53:31.119108597 +0000 UTC m=+0.125969549 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 01:53:31 compute-0 python3.9[201349]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:53:31 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 24 01:53:31 compute-0 podman[201147]: @ - - [24/Nov/2025:01:53:30 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3437 "" "Go-http-client/1.1"
Nov 24 01:53:31 compute-0 systemd[1]: libpod-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.scope: Deactivated successfully.
Nov 24 01:53:31 compute-0 podman[201362]: 2025-11-24 01:53:31.528162556 +0000 UTC m=+0.068760189 container died 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:53:31 compute-0 systemd[1]: 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b-44aca2da347a1bfc.timer: Deactivated successfully.
Nov 24 01:53:31 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.
Nov 24 01:53:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b-userdata-shm.mount: Deactivated successfully.
Nov 24 01:53:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-1368bb2bbcb87711706b3e2b1b5185487cdef2ebbbe3a65cd8acf06dca93dcf7-merged.mount: Deactivated successfully.
Nov 24 01:53:31 compute-0 podman[201362]: 2025-11-24 01:53:31.743483819 +0000 UTC m=+0.284081442 container cleanup 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:53:31 compute-0 podman[201362]: podman_exporter
Nov 24 01:53:31 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 01:53:31 compute-0 podman[201391]: podman_exporter
Nov 24 01:53:31 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 24 01:53:31 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 24 01:53:31 compute-0 systemd[1]: Starting podman_exporter container...
Nov 24 01:53:31 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1368bb2bbcb87711706b3e2b1b5185487cdef2ebbbe3a65cd8acf06dca93dcf7/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1368bb2bbcb87711706b3e2b1b5185487cdef2ebbbe3a65cd8acf06dca93dcf7/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:32 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.
Nov 24 01:53:32 compute-0 podman[201404]: 2025-11-24 01:53:32.013340804 +0000 UTC m=+0.141346726 container init 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.032Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.032Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.032Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.032Z caller=handler.go:105 level=info collector=container
Nov 24 01:53:32 compute-0 podman[201147]: @ - - [24/Nov/2025:01:53:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 01:53:32 compute-0 podman[201147]: time="2025-11-24T01:53:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 01:53:32 compute-0 podman[201404]: 2025-11-24 01:53:32.046993413 +0000 UTC m=+0.174999255 container start 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:53:32 compute-0 podman[201404]: podman_exporter
Nov 24 01:53:32 compute-0 podman[201147]: @ - - [24/Nov/2025:01:53:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19570 "" "Go-http-client/1.1"
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.054Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 01:53:32 compute-0 systemd[1]: Started podman_exporter container.
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.055Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 01:53:32 compute-0 podman_exporter[201419]: ts=2025-11-24T01:53:32.055Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 01:53:32 compute-0 sudo[201343]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:32 compute-0 podman[201429]: 2025-11-24 01:53:32.109136422 +0000 UTC m=+0.049165601 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 01:53:32 compute-0 sudo[201603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyhbzdfefrahrlngqarvwopzlfgjzpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949212.24545-748-1061572367511/AnsiballZ_stat.py'
Nov 24 01:53:32 compute-0 sudo[201603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:32 compute-0 python3.9[201605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:53:32 compute-0 sudo[201603]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:33 compute-0 sudo[201726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziuztponsuetqgoekssdhzgrzruwklwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949212.24545-748-1061572367511/AnsiballZ_copy.py'
Nov 24 01:53:33 compute-0 sudo[201726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:33 compute-0 python3.9[201728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763949212.24545-748-1061572367511/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 01:53:33 compute-0 sudo[201726]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:33 compute-0 sudo[201878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-criwlfunwqtyckwqhgugzgwseioubzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949213.6491468-765-192705231221915/AnsiballZ_container_config_data.py'
Nov 24 01:53:33 compute-0 sudo[201878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:34 compute-0 python3.9[201880]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 24 01:53:34 compute-0 sudo[201878]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:34 compute-0 sudo[202030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihtszpwsxwyswduophwhoeanopgcizer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949214.3834121-774-125398317150004/AnsiballZ_container_config_hash.py'
Nov 24 01:53:34 compute-0 sudo[202030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:34 compute-0 python3.9[202032]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 01:53:34 compute-0 sudo[202030]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:35 compute-0 sudo[202182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkuwwirilemrlybnmvyqjfsqqttluhv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949215.1778798-784-276548194952100/AnsiballZ_edpm_container_manage.py'
Nov 24 01:53:35 compute-0 sudo[202182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:35 compute-0 python3[202184]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 01:53:38 compute-0 podman[202197]: 2025-11-24 01:53:38.358731476 +0000 UTC m=+2.539196675 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 01:53:38 compute-0 podman[202294]: 2025-11-24 01:53:38.509814779 +0000 UTC m=+0.047203415 container create 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 01:53:38 compute-0 podman[202294]: 2025-11-24 01:53:38.484053935 +0000 UTC m=+0.021442571 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 01:53:38 compute-0 python3[202184]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 01:53:38 compute-0 sudo[202182]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:39 compute-0 sudo[202481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywxyqgjqfxpsbmtonfgjsrogizzrsrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949218.8164256-792-24876195648632/AnsiballZ_stat.py'
Nov 24 01:53:39 compute-0 sudo[202481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:39 compute-0 python3.9[202483]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:53:39 compute-0 sudo[202481]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:39 compute-0 sudo[202635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cysxmuxckorvqsczcbyuopstzejcobwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949219.6378295-801-3725182804040/AnsiballZ_file.py'
Nov 24 01:53:39 compute-0 sudo[202635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:40 compute-0 python3.9[202637]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:40 compute-0 sudo[202635]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:40 compute-0 sudo[202799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoqqewnkprmzvvsalblfdxxhfhxuolfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949220.2311926-801-267944670221967/AnsiballZ_copy.py'
Nov 24 01:53:40 compute-0 sudo[202799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:40 compute-0 podman[202760]: 2025-11-24 01:53:40.762309928 +0000 UTC m=+0.071947410 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 01:53:40 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-46169b86c3d63c14.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 01:53:40 compute-0 systemd[1]: 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96-46169b86c3d63c14.service: Failed with result 'exit-code'.
Nov 24 01:53:40 compute-0 python3.9[202805]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763949220.2311926-801-267944670221967/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:40 compute-0 sudo[202799]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:41 compute-0 sudo[202881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfstibcehxeivwicgrqvkziiraefiqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949220.2311926-801-267944670221967/AnsiballZ_systemd.py'
Nov 24 01:53:41 compute-0 sudo[202881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:41 compute-0 python3.9[202883]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 01:53:41 compute-0 systemd[1]: Reloading.
Nov 24 01:53:41 compute-0 systemd-sysv-generator[202912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:41 compute-0 systemd-rc-local-generator[202907]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:41 compute-0 sudo[202881]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:42 compute-0 sudo[202992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjphfmxtjjlkekanucrzacigdtiujppg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949220.2311926-801-267944670221967/AnsiballZ_systemd.py'
Nov 24 01:53:42 compute-0 sudo[202992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:42 compute-0 python3.9[202994]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 01:53:42 compute-0 systemd[1]: Reloading.
Nov 24 01:53:42 compute-0 systemd-rc-local-generator[203023]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 01:53:42 compute-0 systemd-sysv-generator[203029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 01:53:42 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 24 01:53:43 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.
Nov 24 01:53:43 compute-0 podman[203034]: 2025-11-24 01:53:43.139537441 +0000 UTC m=+0.158013451 container init 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9)
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *bridge.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *coverage.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *datapath.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *iface.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *memory.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *ovnnorthd.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *ovn.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *ovsdbserver.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *pmd_perf.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *pmd_rxq.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: INFO    01:53:43 main.go:48: registering *vswitch.Collector
Nov 24 01:53:43 compute-0 openstack_network_exporter[203049]: NOTICE  01:53:43 main.go:76: listening on https://:9105/metrics
Nov 24 01:53:43 compute-0 podman[203034]: 2025-11-24 01:53:43.185713116 +0000 UTC m=+0.204189116 container start 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git)
Nov 24 01:53:43 compute-0 podman[203034]: openstack_network_exporter
Nov 24 01:53:43 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 24 01:53:43 compute-0 sudo[202992]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:43 compute-0 podman[203060]: 2025-11-24 01:53:43.315591215 +0000 UTC m=+0.109354856 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64)
Nov 24 01:53:43 compute-0 sudo[203230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmaphiyvkkzibxifdscskmbnojgyjtrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949223.467316-825-266084011195140/AnsiballZ_systemd.py'
Nov 24 01:53:43 compute-0 sudo[203230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:44 compute-0 python3.9[203232]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 01:53:44 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 24 01:53:44 compute-0 systemd[1]: libpod-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.scope: Deactivated successfully.
Nov 24 01:53:44 compute-0 podman[203236]: 2025-11-24 01:53:44.289693416 +0000 UTC m=+0.060631978 container died 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 01:53:44 compute-0 systemd[1]: 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5-7257d28757adad81.timer: Deactivated successfully.
Nov 24 01:53:44 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.
Nov 24 01:53:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5-userdata-shm.mount: Deactivated successfully.
Nov 24 01:53:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c-merged.mount: Deactivated successfully.
Nov 24 01:53:44 compute-0 podman[203236]: 2025-11-24 01:53:44.928163299 +0000 UTC m=+0.699101851 container cleanup 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal)
Nov 24 01:53:44 compute-0 podman[203236]: openstack_network_exporter
Nov 24 01:53:44 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 01:53:45 compute-0 podman[203263]: openstack_network_exporter
Nov 24 01:53:45 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 24 01:53:45 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 24 01:53:45 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 24 01:53:45 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:53:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a7f8bf12d978956a8cce985cd4f6f97637bacdc9f7d4122fa97097d3fcf42c/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 01:53:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.
Nov 24 01:53:45 compute-0 podman[203276]: 2025-11-24 01:53:45.188316698 +0000 UTC m=+0.139695120 container init 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *bridge.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *coverage.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *datapath.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *iface.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *memory.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *ovnnorthd.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *ovn.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *ovsdbserver.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *pmd_perf.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *pmd_rxq.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: INFO    01:53:45 main.go:48: registering *vswitch.Collector
Nov 24 01:53:45 compute-0 openstack_network_exporter[203291]: NOTICE  01:53:45 main.go:76: listening on https://:9105/metrics
Nov 24 01:53:45 compute-0 podman[203276]: 2025-11-24 01:53:45.225110296 +0000 UTC m=+0.176488628 container start 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 24 01:53:45 compute-0 podman[203276]: openstack_network_exporter
Nov 24 01:53:45 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 24 01:53:45 compute-0 sudo[203230]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:45 compute-0 podman[203301]: 2025-11-24 01:53:45.32813288 +0000 UTC m=+0.083564261 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7)
Nov 24 01:53:45 compute-0 sudo[203473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehdudexxcctmwbahwxqtigpvbdessxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949225.5018728-833-72717222543750/AnsiballZ_find.py'
Nov 24 01:53:45 compute-0 sudo[203473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:46 compute-0 python3.9[203475]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 01:53:46 compute-0 sudo[203473]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:46 compute-0 sudo[203625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwflhjdwpxgjyaoltsforzgitnovwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949226.4333625-843-45041783585178/AnsiballZ_podman_container_info.py'
Nov 24 01:53:46 compute-0 sudo[203625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:47 compute-0 python3.9[203627]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 24 01:53:47 compute-0 sudo[203625]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:47 compute-0 sudo[203790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htopofoiizmepjkzyvwcmhmsyyzcngcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949227.487161-851-223234918097483/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:47 compute-0 sudo[203790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:48 compute-0 python3.9[203792]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:48 compute-0 systemd[1]: Started libpod-conmon-c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4.scope.
Nov 24 01:53:48 compute-0 podman[203793]: 2025-11-24 01:53:48.324294378 +0000 UTC m=+0.101289055 container exec c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 01:53:48 compute-0 podman[203793]: 2025-11-24 01:53:48.360291063 +0000 UTC m=+0.137285730 container exec_died c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 01:53:48 compute-0 systemd[1]: libpod-conmon-c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4.scope: Deactivated successfully.
Nov 24 01:53:48 compute-0 sudo[203790]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:53:48.412 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:53:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:53:48.415 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:53:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:53:48.415 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:53:48 compute-0 sudo[203975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htpgtpkgvdcvgrdokphozilruxagsise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949228.6105425-859-258689145960894/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:48 compute-0 sudo[203975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:49 compute-0 python3.9[203977]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:49 compute-0 systemd[1]: Started libpod-conmon-c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4.scope.
Nov 24 01:53:49 compute-0 podman[203978]: 2025-11-24 01:53:49.275180509 +0000 UTC m=+0.071384204 container exec c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:53:49 compute-0 podman[203978]: 2025-11-24 01:53:49.305612996 +0000 UTC m=+0.101816671 container exec_died c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:53:49 compute-0 systemd[1]: libpod-conmon-c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4.scope: Deactivated successfully.
Nov 24 01:53:49 compute-0 sudo[203975]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:49 compute-0 sudo[204159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkaoohadizgxnigtlrrbsccknwyihngc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949229.5294394-867-8829359580451/AnsiballZ_file.py'
Nov 24 01:53:49 compute-0 sudo[204159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:50 compute-0 python3.9[204161]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:50 compute-0 sudo[204159]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:50 compute-0 sudo[204322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabevzkgjzzmfkegikqvqmciblanhkld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949230.302361-876-109078810211369/AnsiballZ_podman_container_info.py'
Nov 24 01:53:50 compute-0 sudo[204322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:50 compute-0 podman[204285]: 2025-11-24 01:53:50.566579407 +0000 UTC m=+0.051070835 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 01:53:50 compute-0 python3.9[204335]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 24 01:53:50 compute-0 sudo[204322]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:51 compute-0 sudo[204498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfwogjhrrbqoeksyafyporkniwsopvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949231.0786934-884-121267170535723/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:51 compute-0 sudo[204498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:51 compute-0 python3.9[204500]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:51 compute-0 systemd[1]: Started libpod-conmon-ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d.scope.
Nov 24 01:53:51 compute-0 podman[204501]: 2025-11-24 01:53:51.785044518 +0000 UTC m=+0.093924986 container exec ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 01:53:51 compute-0 podman[204501]: 2025-11-24 01:53:51.820463037 +0000 UTC m=+0.129343485 container exec_died ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 24 01:53:51 compute-0 systemd[1]: libpod-conmon-ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d.scope: Deactivated successfully.
Nov 24 01:53:51 compute-0 sudo[204498]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:52 compute-0 sudo[204682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pckiexfjznmvjtidxwaoqsfyzdtwupfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949232.0086381-892-74083183189566/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:52 compute-0 sudo[204682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:52 compute-0 python3.9[204684]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:52 compute-0 systemd[1]: Started libpod-conmon-ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d.scope.
Nov 24 01:53:52 compute-0 podman[204685]: 2025-11-24 01:53:52.587326346 +0000 UTC m=+0.087611396 container exec ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 01:53:52 compute-0 podman[204685]: 2025-11-24 01:53:52.622375885 +0000 UTC m=+0.122660915 container exec_died ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 01:53:52 compute-0 systemd[1]: libpod-conmon-ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d.scope: Deactivated successfully.
Nov 24 01:53:52 compute-0 sudo[204682]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:53 compute-0 sudo[204866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzmzoptqkldhrrratjcljjdynggqpxld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949232.8297231-900-33001981013899/AnsiballZ_file.py'
Nov 24 01:53:53 compute-0 sudo[204866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:53 compute-0 python3.9[204868]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:53 compute-0 sudo[204866]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:54 compute-0 sudo[205018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqcoeyicmkawkpwvmqqgmfcawwctsbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949233.6560466-909-22338860093094/AnsiballZ_podman_container_info.py'
Nov 24 01:53:54 compute-0 sudo[205018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:54 compute-0 python3.9[205020]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 24 01:53:54 compute-0 sudo[205018]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:54 compute-0 sudo[205200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itnmfghsrpffbdwmraudclpqszfuxmeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949234.5009952-917-92366622275632/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:54 compute-0 sudo[205200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:54 compute-0 podman[205157]: 2025-11-24 01:53:54.811032786 +0000 UTC m=+0.060172945 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 01:53:55 compute-0 python3.9[205204]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:55 compute-0 systemd[1]: Started libpod-conmon-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.scope.
Nov 24 01:53:55 compute-0 podman[205205]: 2025-11-24 01:53:55.130306028 +0000 UTC m=+0.100907405 container exec 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Nov 24 01:53:55 compute-0 podman[205205]: 2025-11-24 01:53:55.16724564 +0000 UTC m=+0.137846957 container exec_died 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 24 01:53:55 compute-0 systemd[1]: libpod-conmon-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.scope: Deactivated successfully.
Nov 24 01:53:55 compute-0 sudo[205200]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:55 compute-0 sudo[205389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czhpbzxrngwtkutrpnryiuyiyrqalugm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949235.4314075-925-266389983319663/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:55 compute-0 sudo[205389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:55 compute-0 python3.9[205391]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:56 compute-0 systemd[1]: Started libpod-conmon-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.scope.
Nov 24 01:53:56 compute-0 podman[205392]: 2025-11-24 01:53:56.105613835 +0000 UTC m=+0.094079661 container exec 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Nov 24 01:53:56 compute-0 podman[205392]: 2025-11-24 01:53:56.14021326 +0000 UTC m=+0.128679046 container exec_died 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:53:56 compute-0 systemd[1]: libpod-conmon-493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6.scope: Deactivated successfully.
Nov 24 01:53:56 compute-0 sudo[205389]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:56 compute-0 sudo[205572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzqjnpgtepwamejqxdvnjgwzwpwfraqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949236.3905342-933-247124549538199/AnsiballZ_file.py'
Nov 24 01:53:56 compute-0 sudo[205572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:56 compute-0 python3.9[205574]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:53:56 compute-0 sudo[205572]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:57 compute-0 sudo[205724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnoenjfpddboslcvzennexgjkdnleix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949237.1515315-942-22508994128043/AnsiballZ_podman_container_info.py'
Nov 24 01:53:57 compute-0 sudo[205724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:57 compute-0 python3.9[205726]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 24 01:53:57 compute-0 sudo[205724]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:58 compute-0 sudo[205890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcclzsxolraquympfakdtlfsyqkfjoeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949238.022444-950-188300591439422/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:58 compute-0 sudo[205890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:58 compute-0 python3.9[205892]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:58 compute-0 systemd[1]: Started libpod-conmon-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope.
Nov 24 01:53:58 compute-0 podman[205893]: 2025-11-24 01:53:58.620509097 +0000 UTC m=+0.067741691 container exec 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 01:53:58 compute-0 podman[205893]: 2025-11-24 01:53:58.6497836 +0000 UTC m=+0.097016204 container exec_died 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:53:58 compute-0 systemd[1]: libpod-conmon-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope: Deactivated successfully.
Nov 24 01:53:58 compute-0 sudo[205890]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:59 compute-0 sudo[206074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhucmydnboyzkmktvozjmypozfbbpsev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949238.905479-958-147569896650035/AnsiballZ_podman_container_exec.py'
Nov 24 01:53:59 compute-0 sudo[206074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:53:59 compute-0 python3.9[206076]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:53:59 compute-0 systemd[1]: Started libpod-conmon-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope.
Nov 24 01:53:59 compute-0 podman[206077]: 2025-11-24 01:53:59.532346485 +0000 UTC m=+0.080103602 container exec 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:53:59 compute-0 podman[206077]: 2025-11-24 01:53:59.56728789 +0000 UTC m=+0.115044967 container exec_died 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 24 01:53:59 compute-0 systemd[1]: libpod-conmon-735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96.scope: Deactivated successfully.
Nov 24 01:53:59 compute-0 sudo[206074]: pam_unix(sudo:session): session closed for user root
Nov 24 01:53:59 compute-0 podman[206108]: 2025-11-24 01:53:59.710803918 +0000 UTC m=+0.076611333 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 01:54:00 compute-0 sudo[206275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gozrdgglrdgektrleslgscgpscsyqfoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949239.8091853-966-176248735148723/AnsiballZ_file.py'
Nov 24 01:54:00 compute-0 sudo[206275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:00 compute-0 python3.9[206277]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:00 compute-0 sudo[206275]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:00 compute-0 sudo[206427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcjgftekwzhknfjecojfzswkdheyubcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949240.5443668-975-68562594596894/AnsiballZ_podman_container_info.py'
Nov 24 01:54:00 compute-0 sudo[206427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:01 compute-0 python3.9[206429]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 24 01:54:01 compute-0 sudo[206427]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:01 compute-0 sudo[206611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezjthhgyfvrtbxoftlbwilihztsvokf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949241.3586595-983-204989074666193/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:01 compute-0 sudo[206611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:01 compute-0 podman[206566]: 2025-11-24 01:54:01.722038175 +0000 UTC m=+0.099046691 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:54:01 compute-0 python3.9[206617]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:01 compute-0 systemd[1]: Started libpod-conmon-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.scope.
Nov 24 01:54:02 compute-0 podman[206621]: 2025-11-24 01:54:02.007198677 +0000 UTC m=+0.085094775 container exec b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:54:02 compute-0 podman[206621]: 2025-11-24 01:54:02.042379629 +0000 UTC m=+0.120275707 container exec_died b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:54:02 compute-0 systemd[1]: libpod-conmon-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.scope: Deactivated successfully.
Nov 24 01:54:02 compute-0 sudo[206611]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:02 compute-0 sudo[206817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjquavnydgiecczunutabhjexzmgkygv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949242.26772-991-223770392871997/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:02 compute-0 sudo[206817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:02 compute-0 podman[206776]: 2025-11-24 01:54:02.634327807 +0000 UTC m=+0.102567822 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 01:54:02 compute-0 python3.9[206828]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:02 compute-0 systemd[1]: Started libpod-conmon-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.scope.
Nov 24 01:54:02 compute-0 podman[206829]: 2025-11-24 01:54:02.962140483 +0000 UTC m=+0.086328080 container exec b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:54:02 compute-0 podman[206829]: 2025-11-24 01:54:02.995102032 +0000 UTC m=+0.119289609 container exec_died b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 01:54:03 compute-0 systemd[1]: libpod-conmon-b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213.scope: Deactivated successfully.
Nov 24 01:54:03 compute-0 sudo[206817]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:03 compute-0 sudo[207010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmsfwvwyipkowxkmlzxmimgrdiqktkqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949243.214862-999-131516731448671/AnsiballZ_file.py'
Nov 24 01:54:03 compute-0 sudo[207010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:03 compute-0 python3.9[207012]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:03 compute-0 sudo[207010]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.120 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.136 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.136 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 sudo[207162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfniymapgfpcrkgmhfaluqqpmcitaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949243.8911963-1008-47480849532057/AnsiballZ_podman_container_info.py'
Nov 24 01:54:04 compute-0 sudo[207162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:04 compute-0 python3.9[207164]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 24 01:54:04 compute-0 sudo[207162]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.785 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.785 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.785 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 nova_compute[186999]: 2025-11-24 01:54:04.785 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:04 compute-0 sudo[207327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwttipvhywsrmuoyoxkfyocaooqjinfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949244.6415582-1016-52022092747873/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:04 compute-0 sudo[207327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:05 compute-0 python3.9[207329]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:05 compute-0 systemd[1]: Started libpod-conmon-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.scope.
Nov 24 01:54:05 compute-0 podman[207330]: 2025-11-24 01:54:05.211319618 +0000 UTC m=+0.084297192 container exec 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:54:05 compute-0 podman[207330]: 2025-11-24 01:54:05.241309232 +0000 UTC m=+0.114286786 container exec_died 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 01:54:05 compute-0 systemd[1]: libpod-conmon-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.scope: Deactivated successfully.
Nov 24 01:54:05 compute-0 sudo[207327]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:05 compute-0 sudo[207511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmrynenhnkuvvtflivhtnonxotaqszqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949245.465282-1024-103523209457869/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:05 compute-0 sudo[207511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.798 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.966 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.967 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5923MB free_disk=73.49715805053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.967 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:54:05 compute-0 nova_compute[186999]: 2025-11-24 01:54:05.968 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:54:05 compute-0 python3.9[207513]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.025 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.025 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.059 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:54:06 compute-0 systemd[1]: Started libpod-conmon-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.scope.
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.073 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.075 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:54:06 compute-0 nova_compute[186999]: 2025-11-24 01:54:06.075 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:54:06 compute-0 podman[207514]: 2025-11-24 01:54:06.082342614 +0000 UTC m=+0.074176483 container exec 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:54:06 compute-0 podman[207514]: 2025-11-24 01:54:06.112065491 +0000 UTC m=+0.103899340 container exec_died 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:54:06 compute-0 systemd[1]: libpod-conmon-82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b.scope: Deactivated successfully.
Nov 24 01:54:06 compute-0 sudo[207511]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:06 compute-0 sudo[207695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoldrkzhcqxzjfhrrlnkqqvjtxnwbtom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949246.3864522-1032-92212492998516/AnsiballZ_file.py'
Nov 24 01:54:06 compute-0 sudo[207695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:06 compute-0 python3.9[207697]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:06 compute-0 sudo[207695]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:07 compute-0 sudo[207847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlqabkxnjhdiovfkgjqmimxudizenqcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949247.1598341-1041-214255528192335/AnsiballZ_podman_container_info.py'
Nov 24 01:54:07 compute-0 sudo[207847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:07 compute-0 python3.9[207849]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 24 01:54:07 compute-0 sudo[207847]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:08 compute-0 sudo[208012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzgpoinrselhptmqaverjbogtsxflgon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949247.8390183-1049-228430426038610/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:08 compute-0 sudo[208012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:08 compute-0 python3.9[208014]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:08 compute-0 systemd[1]: Started libpod-conmon-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.scope.
Nov 24 01:54:08 compute-0 podman[208015]: 2025-11-24 01:54:08.400104263 +0000 UTC m=+0.065902718 container exec 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 24 01:54:08 compute-0 podman[208015]: 2025-11-24 01:54:08.433305588 +0000 UTC m=+0.099103983 container exec_died 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 24 01:54:08 compute-0 systemd[1]: libpod-conmon-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.scope: Deactivated successfully.
Nov 24 01:54:08 compute-0 sudo[208012]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:08 compute-0 sudo[208197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwbfgcvqjffvputdkxfxjeekvvjspck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949248.644184-1057-241320310182661/AnsiballZ_podman_container_exec.py'
Nov 24 01:54:08 compute-0 sudo[208197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:09 compute-0 python3.9[208199]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 01:54:09 compute-0 systemd[1]: Started libpod-conmon-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.scope.
Nov 24 01:54:09 compute-0 podman[208200]: 2025-11-24 01:54:09.197749628 +0000 UTC m=+0.080166634 container exec 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 24 01:54:09 compute-0 podman[208200]: 2025-11-24 01:54:09.232329833 +0000 UTC m=+0.114746869 container exec_died 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, vcs-type=git, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 24 01:54:09 compute-0 systemd[1]: libpod-conmon-47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5.scope: Deactivated successfully.
Nov 24 01:54:09 compute-0 sudo[208197]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:09 compute-0 sudo[208382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtpknqyeaqkakagafyojotgsgtkyrigv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949249.4698212-1065-120355874480475/AnsiballZ_file.py'
Nov 24 01:54:09 compute-0 sudo[208382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:09 compute-0 python3.9[208384]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:09 compute-0 sudo[208382]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:10 compute-0 sudo[208534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwsdhitajddpbboxsjiadkzlqsjhaaxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949250.189045-1074-73818803523631/AnsiballZ_file.py'
Nov 24 01:54:10 compute-0 sudo[208534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:10 compute-0 python3.9[208536]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:10 compute-0 sudo[208534]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:11 compute-0 sudo[208702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiueoiumnanuuujpkepplquyhwhpcbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949250.871677-1082-10538132176965/AnsiballZ_stat.py'
Nov 24 01:54:11 compute-0 sudo[208702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:11 compute-0 podman[208660]: 2025-11-24 01:54:11.238612541 +0000 UTC m=+0.097810877 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 24 01:54:11 compute-0 python3.9[208708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:11 compute-0 sudo[208702]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:11 compute-0 sudo[208830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykgsvepfkhyattvuzvuacbeiixohvbke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949250.871677-1082-10538132176965/AnsiballZ_copy.py'
Nov 24 01:54:11 compute-0 sudo[208830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:11 compute-0 python3.9[208832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763949250.871677-1082-10538132176965/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:11 compute-0 sudo[208830]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:12 compute-0 sudo[208982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vllxxyrvzjmbhvzsiizcrwzzbgjwgfbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949252.1976433-1098-22261147688600/AnsiballZ_file.py'
Nov 24 01:54:12 compute-0 sudo[208982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:12 compute-0 python3.9[208984]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:12 compute-0 sudo[208982]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:13 compute-0 sudo[209134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqvehhqgxinoxouigfknqrwueqrzbtdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949252.8995867-1106-81893208984183/AnsiballZ_stat.py'
Nov 24 01:54:13 compute-0 sudo[209134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:13 compute-0 python3.9[209136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:13 compute-0 sudo[209134]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:13 compute-0 sudo[209212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqpqirkcthmnqzlhvacjcxfjshiiqysd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949252.8995867-1106-81893208984183/AnsiballZ_file.py'
Nov 24 01:54:13 compute-0 sudo[209212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:13 compute-0 python3.9[209214]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:13 compute-0 sudo[209212]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:14 compute-0 sudo[209366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgtotmxlkagyugmyqmvpgeuhrfdasnco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949254.0730205-1118-27439554712180/AnsiballZ_stat.py'
Nov 24 01:54:14 compute-0 sudo[209366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:14 compute-0 python3.9[209368]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:14 compute-0 sudo[209366]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:14 compute-0 sudo[209444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufgoumqaojcupyxijfvnczrcofjxwwbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949254.0730205-1118-27439554712180/AnsiballZ_file.py'
Nov 24 01:54:14 compute-0 sudo[209444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:14 compute-0 sshd-session[209276]: Received disconnect from 46.188.119.26 port 35122:11: Bye Bye [preauth]
Nov 24 01:54:14 compute-0 sshd-session[209276]: Disconnected from authenticating user root 46.188.119.26 port 35122 [preauth]
Nov 24 01:54:15 compute-0 python3.9[209446]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.m8zxp9_w recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:15 compute-0 sudo[209444]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:15 compute-0 sudo[209606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flurqewtwzkufcknpxfkpyuefauhvmsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949255.2540274-1130-222432151018107/AnsiballZ_stat.py'
Nov 24 01:54:15 compute-0 sudo[209606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:15 compute-0 podman[209570]: 2025-11-24 01:54:15.617695014 +0000 UTC m=+0.116078906 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 01:54:15 compute-0 python3.9[209614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:15 compute-0 sudo[209606]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:16 compute-0 sudo[209693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twdiwiuqjskqiqxgfzswgbxhwdfynihw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949255.2540274-1130-222432151018107/AnsiballZ_file.py'
Nov 24 01:54:16 compute-0 sudo[209693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:16 compute-0 python3.9[209695]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:16 compute-0 sudo[209693]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:16 compute-0 sudo[209845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbtoiuehrgdybzqmwxpiujbxzemohcns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949256.441552-1143-37766744296597/AnsiballZ_command.py'
Nov 24 01:54:16 compute-0 sudo[209845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:16 compute-0 python3.9[209847]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:54:16 compute-0 sudo[209845]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:17 compute-0 sudo[209998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmmtrzuddxduluxxhkfmulvvgftrxhpd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763949257.1234665-1151-24237315732700/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 01:54:17 compute-0 sudo[209998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:17 compute-0 python3[210000]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 01:54:17 compute-0 sudo[209998]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:18 compute-0 sudo[210150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbytwokttpckkrynofsirsmqvlzrmtuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949258.065824-1159-221316119574337/AnsiballZ_stat.py'
Nov 24 01:54:18 compute-0 sudo[210150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:18 compute-0 python3.9[210152]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:18 compute-0 sudo[210150]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:18 compute-0 sudo[210228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgyysewkjapjqffdgyqxclovvhrbcyfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949258.065824-1159-221316119574337/AnsiballZ_file.py'
Nov 24 01:54:18 compute-0 sudo[210228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:19 compute-0 python3.9[210230]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:19 compute-0 sudo[210228]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:19 compute-0 sudo[210380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxtesgxmdacbyrrgqgbrmngaxusyzgbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949259.2078052-1171-152755496252054/AnsiballZ_stat.py'
Nov 24 01:54:19 compute-0 sudo[210380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:19 compute-0 python3.9[210382]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:19 compute-0 sudo[210380]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:20 compute-0 sudo[210458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpvfzjuhdmuzprfshgfzbcgeoifdfieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949259.2078052-1171-152755496252054/AnsiballZ_file.py'
Nov 24 01:54:20 compute-0 sudo[210458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:20 compute-0 python3.9[210460]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:20 compute-0 sudo[210458]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:20 compute-0 podman[210584]: 2025-11-24 01:54:20.822570215 +0000 UTC m=+0.065731193 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:54:20 compute-0 sudo[210628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfrggsekroirncpumpfmewltcxxauirc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949260.500859-1183-247304792712628/AnsiballZ_stat.py'
Nov 24 01:54:20 compute-0 sudo[210628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:21 compute-0 python3.9[210637]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:21 compute-0 sudo[210628]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:21 compute-0 sudo[210714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbywkjcxcwtgvbtcaguyvsrlejnffqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949260.500859-1183-247304792712628/AnsiballZ_file.py'
Nov 24 01:54:21 compute-0 sudo[210714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:21 compute-0 python3.9[210716]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:21 compute-0 sudo[210714]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:22 compute-0 sudo[210866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhbepurzlnespxdsqquqiupboczbyfwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949261.9842415-1195-151134418995019/AnsiballZ_stat.py'
Nov 24 01:54:22 compute-0 sudo[210866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:22 compute-0 python3.9[210868]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:22 compute-0 sudo[210866]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:22 compute-0 sudo[210944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnsoltuaybqsssugykiglehpufhnynzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949261.9842415-1195-151134418995019/AnsiballZ_file.py'
Nov 24 01:54:22 compute-0 sudo[210944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:22 compute-0 python3.9[210946]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:22 compute-0 sudo[210944]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:23 compute-0 sudo[211096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udzgltixmzgcwsrymkvylwksdqnzuser ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949263.0868564-1207-19621780492015/AnsiballZ_stat.py'
Nov 24 01:54:23 compute-0 sudo[211096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:23 compute-0 python3.9[211098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 01:54:23 compute-0 sudo[211096]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:23 compute-0 sudo[211221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otkszefdmlhyjmruuklzgcrghzejyuqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949263.0868564-1207-19621780492015/AnsiballZ_copy.py'
Nov 24 01:54:23 compute-0 sudo[211221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:24 compute-0 python3.9[211223]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763949263.0868564-1207-19621780492015/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:24 compute-0 sudo[211221]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:24 compute-0 sudo[211373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fowimklhmgzupxlpkbuylrzmmylmfsqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949264.361043-1222-111862130988728/AnsiballZ_file.py'
Nov 24 01:54:24 compute-0 sudo[211373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:24 compute-0 python3.9[211375]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:24 compute-0 sudo[211373]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:25 compute-0 sudo[211542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmbfayezhrfnazgamkdozffzqsxgdbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949265.0692637-1230-54717002103081/AnsiballZ_command.py'
Nov 24 01:54:25 compute-0 podman[211499]: 2025-11-24 01:54:25.50333779 +0000 UTC m=+0.049521222 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 24 01:54:25 compute-0 sudo[211542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:25 compute-0 python3.9[211546]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:54:25 compute-0 sudo[211542]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:26 compute-0 sudo[211699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oihmqurgyxbqeelrrvvrmynhkbocrdsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949265.9068866-1238-81865601741377/AnsiballZ_blockinfile.py'
Nov 24 01:54:26 compute-0 sudo[211699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:26 compute-0 python3.9[211701]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:26 compute-0 sudo[211699]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:27 compute-0 sudo[211851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfkrgugnbrrpnsublwmilojsdqjbuxmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949266.852536-1247-137505223617537/AnsiballZ_command.py'
Nov 24 01:54:27 compute-0 sudo[211851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:27 compute-0 python3.9[211853]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:54:27 compute-0 sudo[211851]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:27 compute-0 sudo[212004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upfrnphqjzqnxdslmcifldnlxxrcvrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949267.5838983-1255-138555277221465/AnsiballZ_stat.py'
Nov 24 01:54:27 compute-0 sudo[212004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:28 compute-0 python3.9[212006]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 01:54:28 compute-0 sudo[212004]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:28 compute-0 sudo[212158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcsuhykxfhltemqlfhihsjizhxgqlyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949268.3215847-1263-174744802494696/AnsiballZ_command.py'
Nov 24 01:54:28 compute-0 sudo[212158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:28 compute-0 python3.9[212160]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 01:54:28 compute-0 sudo[212158]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:29 compute-0 sudo[212313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eonvfzoqcynzykpcoegbhvckovvhksyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763949269.2006316-1271-1988133915038/AnsiballZ_file.py'
Nov 24 01:54:29 compute-0 sudo[212313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 01:54:29 compute-0 python3.9[212315]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 01:54:29 compute-0 sudo[212313]: pam_unix(sudo:session): session closed for user root
Nov 24 01:54:29 compute-0 podman[212321]: 2025-11-24 01:54:29.801696917 +0000 UTC m=+0.061391612 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 01:54:30 compute-0 sshd-session[187302]: Connection closed by 192.168.122.30 port 43062
Nov 24 01:54:30 compute-0 sshd-session[187299]: pam_unix(sshd:session): session closed for user zuul
Nov 24 01:54:30 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 24 01:54:30 compute-0 systemd[1]: session-25.scope: Consumed 1min 48.389s CPU time.
Nov 24 01:54:30 compute-0 systemd-logind[791]: Session 25 logged out. Waiting for processes to exit.
Nov 24 01:54:30 compute-0 systemd-logind[791]: Removed session 25.
Nov 24 01:54:32 compute-0 podman[212360]: 2025-11-24 01:54:32.805677789 +0000 UTC m=+0.055825567 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:54:32 compute-0 podman[212361]: 2025-11-24 01:54:32.83325714 +0000 UTC m=+0.081909934 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 01:54:37 compute-0 sshd-session[212409]: Received disconnect from 80.94.93.233 port 36864:11:  [preauth]
Nov 24 01:54:37 compute-0 sshd-session[212409]: Disconnected from authenticating user root 80.94.93.233 port 36864 [preauth]
Nov 24 01:54:41 compute-0 podman[212411]: 2025-11-24 01:54:41.819929777 +0000 UTC m=+0.072799790 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 24 01:54:44 compute-0 sshd-session[212430]: Received disconnect from 154.90.59.75 port 32790:11: Bye Bye [preauth]
Nov 24 01:54:44 compute-0 sshd-session[212430]: Disconnected from authenticating user root 154.90.59.75 port 32790 [preauth]
Nov 24 01:54:45 compute-0 podman[212432]: 2025-11-24 01:54:45.840949386 +0000 UTC m=+0.083016299 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red 
Hat, Inc., config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 24 01:54:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:54:48.413 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:54:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:54:48.414 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:54:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:54:48.414 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:54:51 compute-0 podman[212453]: 2025-11-24 01:54:51.806760432 +0000 UTC m=+0.063040450 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 01:54:55 compute-0 podman[212477]: 2025-11-24 01:54:55.817854748 +0000 UTC m=+0.064003711 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 01:55:00 compute-0 podman[212497]: 2025-11-24 01:55:00.817962031 +0000 UTC m=+0.064198626 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 24 01:55:03 compute-0 podman[212518]: 2025-11-24 01:55:03.799833595 +0000 UTC m=+0.055116641 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 01:55:03 compute-0 podman[212519]: 2025-11-24 01:55:03.837790829 +0000 UTC m=+0.087869518 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.076 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 01:55:04 compute-0 nova_compute[186999]: 2025-11-24 01:55:04.788 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.797 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.798 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.968 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.970 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6023MB free_disk=73.49641036987305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.970 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:55:05 compute-0 nova_compute[186999]: 2025-11-24 01:55:05.971 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.074 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.075 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.098 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.111 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.113 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:55:06 compute-0 nova_compute[186999]: 2025-11-24 01:55:06.114 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:55:07 compute-0 nova_compute[186999]: 2025-11-24 01:55:07.110 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:07 compute-0 nova_compute[186999]: 2025-11-24 01:55:07.110 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:07 compute-0 nova_compute[186999]: 2025-11-24 01:55:07.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:55:07 compute-0 nova_compute[186999]: 2025-11-24 01:55:07.773 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:55:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:10.724 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:55:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:10.726 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 01:55:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:10.727 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:55:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 01:55:11 compute-0 sshd-session[212517]: Invalid user admin from 68.210.96.117 port 39566
Nov 24 01:55:12 compute-0 sshd-session[212517]: Connection closed by invalid user admin 68.210.96.117 port 39566 [preauth]
Nov 24 01:55:12 compute-0 podman[212569]: 2025-11-24 01:55:12.812161449 +0000 UTC m=+0.066305933 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 01:55:16 compute-0 sshd-session[212590]: Invalid user telecomadmin from 80.94.95.116 port 33878
Nov 24 01:55:16 compute-0 podman[212592]: 2025-11-24 01:55:16.481637999 +0000 UTC m=+0.060807439 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 24 01:55:16 compute-0 sshd-session[212590]: Connection closed by invalid user telecomadmin 80.94.95.116 port 33878 [preauth]
Nov 24 01:55:22 compute-0 podman[212614]: 2025-11-24 01:55:22.819788287 +0000 UTC m=+0.068039012 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:55:26 compute-0 podman[212640]: 2025-11-24 01:55:26.825704286 +0000 UTC m=+0.071641414 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:55:27 compute-0 sshd-session[212638]: Invalid user albert from 46.188.119.26 port 35454
Nov 24 01:55:27 compute-0 sshd-session[212638]: Received disconnect from 46.188.119.26 port 35454:11: Bye Bye [preauth]
Nov 24 01:55:27 compute-0 sshd-session[212638]: Disconnected from invalid user albert 46.188.119.26 port 35454 [preauth]
Nov 24 01:55:31 compute-0 podman[212659]: 2025-11-24 01:55:31.801129765 +0000 UTC m=+0.057381102 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 01:55:34 compute-0 podman[212681]: 2025-11-24 01:55:34.815415913 +0000 UTC m=+0.072233922 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:55:34 compute-0 podman[212682]: 2025-11-24 01:55:34.839811572 +0000 UTC m=+0.090830327 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:55:43 compute-0 podman[212731]: 2025-11-24 01:55:43.814943863 +0000 UTC m=+0.071261524 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118)
Nov 24 01:55:46 compute-0 podman[212751]: 2025-11-24 01:55:46.820947748 +0000 UTC m=+0.070133622 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=)
Nov 24 01:55:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:48.415 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:55:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:48.415 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:55:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:55:48.416 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:55:53 compute-0 podman[212773]: 2025-11-24 01:55:53.797180135 +0000 UTC m=+0.046382111 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:55:57 compute-0 podman[212798]: 2025-11-24 01:55:57.804158185 +0000 UTC m=+0.048914932 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 01:56:02 compute-0 sshd-session[212817]: Received disconnect from 154.90.59.75 port 43740:11: Bye Bye [preauth]
Nov 24 01:56:02 compute-0 sshd-session[212817]: Disconnected from authenticating user daemon 154.90.59.75 port 43740 [preauth]
Nov 24 01:56:02 compute-0 podman[212819]: 2025-11-24 01:56:02.836681505 +0000 UTC m=+0.076842622 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 01:56:04 compute-0 nova_compute[186999]: 2025-11-24 01:56:04.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.768 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.782 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.782 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.798 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 01:56:05 compute-0 nova_compute[186999]: 2025-11-24 01:56:05.799 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:05 compute-0 podman[212841]: 2025-11-24 01:56:05.806474256 +0000 UTC m=+0.054993525 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:56:05 compute-0 podman[212842]: 2025-11-24 01:56:05.895798349 +0000 UTC m=+0.132186175 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.799 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.799 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.800 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:56:06 compute-0 nova_compute[186999]: 2025-11-24 01:56:06.800 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.008 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.009 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6062MB free_disk=73.49628448486328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.009 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.010 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.068 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.068 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.101 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.114 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.116 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:56:07 compute-0 nova_compute[186999]: 2025-11-24 01:56:07.116 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:56:08 compute-0 nova_compute[186999]: 2025-11-24 01:56:08.116 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:08 compute-0 nova_compute[186999]: 2025-11-24 01:56:08.117 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:08 compute-0 nova_compute[186999]: 2025-11-24 01:56:08.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:08 compute-0 nova_compute[186999]: 2025-11-24 01:56:08.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:56:08 compute-0 nova_compute[186999]: 2025-11-24 01:56:08.770 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:56:14 compute-0 podman[212890]: 2025-11-24 01:56:14.812740809 +0000 UTC m=+0.064942506 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 01:56:17 compute-0 podman[212909]: 2025-11-24 01:56:17.80520696 +0000 UTC m=+0.058338039 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 01:56:24 compute-0 podman[212931]: 2025-11-24 01:56:24.834843024 +0000 UTC m=+0.088361557 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 01:56:28 compute-0 podman[212956]: 2025-11-24 01:56:28.827263414 +0000 UTC m=+0.083848540 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:56:33 compute-0 podman[212976]: 2025-11-24 01:56:33.805763179 +0000 UTC m=+0.062352262 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 01:56:36 compute-0 podman[212996]: 2025-11-24 01:56:36.832605104 +0000 UTC m=+0.084829447 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:56:36 compute-0 podman[212997]: 2025-11-24 01:56:36.866948247 +0000 UTC m=+0.111267134 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 01:56:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:43.741 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:56:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:43.742 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 01:56:44 compute-0 sshd-session[213048]: Received disconnect from 46.188.119.26 port 35794:11: Bye Bye [preauth]
Nov 24 01:56:44 compute-0 sshd-session[213048]: Disconnected from authenticating user root 46.188.119.26 port 35794 [preauth]
Nov 24 01:56:45 compute-0 podman[213050]: 2025-11-24 01:56:45.796596162 +0000 UTC m=+0.055510329 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 01:56:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:48.416 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:48.416 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:48.416 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:56:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:56:48.743 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:56:48 compute-0 podman[213071]: 2025-11-24 01:56:48.834991647 +0000 UTC m=+0.087898804 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 01:56:55 compute-0 podman[213093]: 2025-11-24 01:56:55.816582763 +0000 UTC m=+0.065117033 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.570 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.571 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.591 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.748 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.750 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.759 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.759 187003 INFO nova.compute.claims [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Claim successful on node compute-0.ctlplane.example.com
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.885 187003 DEBUG nova.compute.provider_tree [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.894 187003 DEBUG nova.scheduler.client.report [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.911 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.911 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.945 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.945 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.966 187003 INFO nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 01:56:58 compute-0 nova_compute[186999]: 2025-11-24 01:56:58.987 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.085 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.087 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.088 187003 INFO nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Creating image(s)
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.089 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.089 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.091 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.091 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.092 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.618 187003 WARNING oslo_policy.policy [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.619 187003 WARNING oslo_policy.policy [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 01:56:59 compute-0 nova_compute[186999]: 2025-11-24 01:56:59.623 187003 DEBUG nova.policy [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 01:56:59 compute-0 podman[213119]: 2025-11-24 01:56:59.814475568 +0000 UTC m=+0.060967395 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.404 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.499 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.part --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.501 187003 DEBUG nova.virt.images [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] b6697012-8086-43d5-999a-6bb711240eaa was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.504 187003 DEBUG nova.privsep.utils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.505 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.part /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.681 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.part /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.converted" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.689 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.782 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1.converted --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.784 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:00 compute-0 nova_compute[186999]: 2025-11-24 01:57:00.809 187003 INFO oslo.privsep.daemon [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp6zuhtygo/privsep.sock']
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.527 187003 INFO oslo.privsep.daemon [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Spawned new privsep daemon via rootwrap
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.405 213157 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.413 213157 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.417 213157 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.418 213157 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213157
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.643 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.698 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.699 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.700 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.711 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.729 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Successfully created port: b1da3c12-e629-4325-be7d-3295c80a73da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.765 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.766 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.800 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.801 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.802 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.854 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.855 187003 DEBUG nova.virt.disk.api [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.855 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.913 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.915 187003 DEBUG nova.virt.disk.api [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.915 187003 DEBUG nova.objects.instance [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid a4dcff35-86ac-46bc-939c-dc6316ffd80f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.936 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.937 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Ensure instance console log exists: /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.937 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.938 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:01 compute-0 nova_compute[186999]: 2025-11-24 01:57:01.938 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:02 compute-0 nova_compute[186999]: 2025-11-24 01:57:02.972 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Successfully updated port: b1da3c12-e629-4325-be7d-3295c80a73da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 01:57:02 compute-0 nova_compute[186999]: 2025-11-24 01:57:02.990 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:57:02 compute-0 nova_compute[186999]: 2025-11-24 01:57:02.990 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:57:02 compute-0 nova_compute[186999]: 2025-11-24 01:57:02.990 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.425 187003 DEBUG nova.compute.manager [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.425 187003 DEBUG nova.compute.manager [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing instance network info cache due to event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.426 187003 DEBUG oslo_concurrency.lockutils [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.565 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.789 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.789 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 01:57:03 compute-0 nova_compute[186999]: 2025-11-24 01:57:03.800 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.594 187003 DEBUG nova.network.neutron [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.622 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.623 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Instance network_info: |[{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.624 187003 DEBUG oslo_concurrency.lockutils [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.624 187003 DEBUG nova.network.neutron [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.631 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Start _get_guest_xml network_info=[{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.642 187003 WARNING nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.652 187003 DEBUG nova.virt.libvirt.host [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.653 187003 DEBUG nova.virt.libvirt.host [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.657 187003 DEBUG nova.virt.libvirt.host [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.658 187003 DEBUG nova.virt.libvirt.host [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.659 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.660 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.660 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.661 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.661 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.661 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.662 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.662 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.662 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.663 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.663 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.663 187003 DEBUG nova.virt.hardware [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.668 187003 DEBUG nova.privsep.utils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.669 187003 DEBUG nova.virt.libvirt.vif [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:56:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-176359546',display_name='tempest-TestNetworkBasicOps-server-176359546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-176359546',id=1,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI7K6FXfe3f7i9M+Lrq9UdTuwMlrNMPu9xzDTj17VZVDGOtmQOma9x4vWlM1AXFT60jK8li/Bc1daG4yB3t2WpOteUGAiwqwPlxKCKEKz4j8h3i95vSMtJhtNQ0wmbjHlA==',key_name='tempest-TestNetworkBasicOps-1719143779',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-4mazle30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:56:59Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a4dcff35-86ac-46bc-939c-dc6316ffd80f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.670 187003 DEBUG nova.network.os_vif_util [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.671 187003 DEBUG nova.network.os_vif_util [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.673 187003 DEBUG nova.objects.instance [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid a4dcff35-86ac-46bc-939c-dc6316ffd80f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.687 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] End _get_guest_xml xml=<domain type="kvm">
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <uuid>a4dcff35-86ac-46bc-939c-dc6316ffd80f</uuid>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <name>instance-00000001</name>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-176359546</nova:name>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 01:57:04</nova:creationTime>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         <nova:port uuid="b1da3c12-e629-4325-be7d-3295c80a73da">
Nov 24 01:57:04 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <system>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="serial">a4dcff35-86ac-46bc-939c-dc6316ffd80f</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="uuid">a4dcff35-86ac-46bc-939c-dc6316ffd80f</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </system>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <os>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </os>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <features>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </features>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.config"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:92:3f:6b"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <target dev="tapb1da3c12-e6"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/console.log" append="off"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <video>
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </video>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 01:57:04 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 01:57:04 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:57:04 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:57:04 compute-0 nova_compute[186999]: </domain>
Nov 24 01:57:04 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.689 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Preparing to wait for external event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.690 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.690 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.691 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.692 187003 DEBUG nova.virt.libvirt.vif [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:56:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-176359546',display_name='tempest-TestNetworkBasicOps-server-176359546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-176359546',id=1,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI7K6FXfe3f7i9M+Lrq9UdTuwMlrNMPu9xzDTj17VZVDGOtmQOma9x4vWlM1AXFT60jK8li/Bc1daG4yB3t2WpOteUGAiwqwPlxKCKEKz4j8h3i95vSMtJhtNQ0wmbjHlA==',key_name='tempest-TestNetworkBasicOps-1719143779',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-4mazle30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:56:59Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a4dcff35-86ac-46bc-939c-dc6316ffd80f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.692 187003 DEBUG nova.network.os_vif_util [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.694 187003 DEBUG nova.network.os_vif_util [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.694 187003 DEBUG os_vif [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.799 187003 DEBUG ovsdbapp.backend.ovs_idl [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.799 187003 DEBUG ovsdbapp.backend.ovs_idl [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.800 187003 DEBUG ovsdbapp.backend.ovs_idl [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.801 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.803 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.803 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.804 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.806 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.808 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.809 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.820 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.820 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.821 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:57:04 compute-0 nova_compute[186999]: 2025-11-24 01:57:04.822 187003 INFO oslo.privsep.daemon [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpbkay2tzf/privsep.sock']
Nov 24 01:57:04 compute-0 podman[213174]: 2025-11-24 01:57:04.859070344 +0000 UTC m=+0.111126389 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.466 187003 INFO oslo.privsep.daemon [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Spawned new privsep daemon via rootwrap
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.352 213200 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.356 213200 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.357 213200 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.358 213200 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213200
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.783 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.783 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1da3c12-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.784 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1da3c12-e6, col_values=(('external_ids', {'iface-id': 'b1da3c12-e629-4325-be7d-3295c80a73da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:3f:6b', 'vm-uuid': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.785 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:05 compute-0 NetworkManager[55458]: <info>  [1763949425.7866] manager: (tapb1da3c12-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.789 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.789 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.790 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.799 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.800 187003 INFO os_vif [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6')
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.850 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.850 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.851 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:92:3f:6b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.851 187003 INFO nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Using config drive
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.857 187003 DEBUG nova.network.neutron [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updated VIF entry in instance network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.857 187003 DEBUG nova.network.neutron [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:57:05 compute-0 nova_compute[186999]: 2025-11-24 01:57:05.897 187003 DEBUG oslo_concurrency.lockutils [req-3722ce8d-dd22-4817-83d3-c9871e710dcb req-ae18f896-4533-403e-9168-ede3e6e04496 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.392 187003 INFO nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Creating config drive at /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.config
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.401 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69qfx93_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.540 187003 DEBUG oslo_concurrency.processutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69qfx93_" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.594 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:06 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 24 01:57:06 compute-0 kernel: tapb1da3c12-e6: entered promiscuous mode
Nov 24 01:57:06 compute-0 NetworkManager[55458]: <info>  [1763949426.6491] manager: (tapb1da3c12-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.650 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:06 compute-0 ovn_controller[95380]: 2025-11-24T01:57:06Z|00027|binding|INFO|Claiming lport b1da3c12-e629-4325-be7d-3295c80a73da for this chassis.
Nov 24 01:57:06 compute-0 ovn_controller[95380]: 2025-11-24T01:57:06Z|00028|binding|INFO|b1da3c12-e629-4325-be7d-3295c80a73da: Claiming fa:16:3e:92:3f:6b 10.100.0.4
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.656 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:06.682 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:3f:6b 10.100.0.4'], port_security=['fa:16:3e:92:3f:6b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56b453c1-ee78-40be-9431-0afc399d7dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87cc0efe-9bdc-4b2d-8d1b-45269d7fdc68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42306af5-a21b-4874-a8e2-8ada30faaa43, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=b1da3c12-e629-4325-be7d-3295c80a73da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:57:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:06.684 104238 INFO neutron.agent.ovn.metadata.agent [-] Port b1da3c12-e629-4325-be7d-3295c80a73da in datapath 56b453c1-ee78-40be-9431-0afc399d7dbc bound to our chassis
Nov 24 01:57:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:06.687 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 56b453c1-ee78-40be-9431-0afc399d7dbc
Nov 24 01:57:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:06.690 104238 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpo4s04azn/privsep.sock']
Nov 24 01:57:06 compute-0 systemd-udevd[213226]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:57:06 compute-0 NetworkManager[55458]: <info>  [1763949426.7391] device (tapb1da3c12-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:57:06 compute-0 NetworkManager[55458]: <info>  [1763949426.7405] device (tapb1da3c12-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 01:57:06 compute-0 systemd-machined[153319]: New machine qemu-1-instance-00000001.
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.768 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:06 compute-0 ovn_controller[95380]: 2025-11-24T01:57:06Z|00029|binding|INFO|Setting lport b1da3c12-e629-4325-be7d-3295c80a73da ovn-installed in OVS
Nov 24 01:57:06 compute-0 ovn_controller[95380]: 2025-11-24T01:57:06Z|00030|binding|INFO|Setting lport b1da3c12-e629-4325-be7d-3295c80a73da up in Southbound
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.775 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:06 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.797 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.798 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.907 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.998 187003 DEBUG nova.compute.manager [req-c071c973-6330-4bf1-8135-c5e7c06153ac req-40500909-8fd6-4c45-a38e-7b7883533335 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.999 187003 DEBUG oslo_concurrency.lockutils [req-c071c973-6330-4bf1-8135-c5e7c06153ac req-40500909-8fd6-4c45-a38e-7b7883533335 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:06 compute-0 nova_compute[186999]: 2025-11-24 01:57:06.999 187003 DEBUG oslo_concurrency.lockutils [req-c071c973-6330-4bf1-8135-c5e7c06153ac req-40500909-8fd6-4c45-a38e-7b7883533335 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.000 187003 DEBUG oslo_concurrency.lockutils [req-c071c973-6330-4bf1-8135-c5e7c06153ac req-40500909-8fd6-4c45-a38e-7b7883533335 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.000 187003 DEBUG nova.compute.manager [req-c071c973-6330-4bf1-8135-c5e7c06153ac req-40500909-8fd6-4c45-a38e-7b7883533335 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Processing event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.007 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.008 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.068 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.191 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.193 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949427.1919165, a4dcff35-86ac-46bc-939c-dc6316ffd80f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.193 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] VM Started (Lifecycle Event)
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.196 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.199 187003 INFO nova.virt.libvirt.driver [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Instance spawned successfully.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.199 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.224 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.227 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.238 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.239 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5931MB free_disk=73.46137237548828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.239 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.239 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.252 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.253 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949427.1920211, a4dcff35-86ac-46bc-939c-dc6316ffd80f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.253 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] VM Paused (Lifecycle Event)
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.271 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.279 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.279 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.280 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.280 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.280 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.281 187003 DEBUG nova.virt.libvirt.driver [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.286 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949427.2052796, a4dcff35-86ac-46bc-939c-dc6316ffd80f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.286 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] VM Resumed (Lifecycle Event)
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.306 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.308 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.326 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.331 187003 INFO nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Took 8.25 seconds to spawn the instance on the hypervisor.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.331 187003 DEBUG nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.333 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance a4dcff35-86ac-46bc-939c-dc6316ffd80f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.333 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.333 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.403 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing inventories for resource provider f28f14d1-2972-450a-b67e-0899e7918234 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.420 187003 INFO nova.compute.manager [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Took 8.72 seconds to build instance.
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.440 187003 DEBUG oslo_concurrency.lockutils [None req-7148e40a-c72c-4ba7-9260-c683c0a50125 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.458 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating ProviderTree inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.462 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.485 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing aggregate associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.503 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing trait associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, traits: COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AESNI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,HW_CPU_X86_ABM,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.526 104238 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.527 104238 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpo4s04azn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.348 213256 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.353 213256 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.355 213256 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.355 213256 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213256
Nov 24 01:57:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:07.530 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[361ac09b-85a1-4797-ae3e-75537eacb186]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.541 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.576 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updated inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.577 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating resource provider f28f14d1-2972-450a-b67e-0899e7918234 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.577 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.594 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:57:07 compute-0 nova_compute[186999]: 2025-11-24 01:57:07.595 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:07 compute-0 podman[213261]: 2025-11-24 01:57:07.840920964 +0000 UTC m=+0.086199806 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 01:57:07 compute-0 podman[213262]: 2025-11-24 01:57:07.876429749 +0000 UTC m=+0.110956264 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.055 213256 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.055 213256 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.055 213256 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.593 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6721e603-9a14-4575-8b56-ad6bc860eee5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.596 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap56b453c1-e1 in ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 01:57:08 compute-0 nova_compute[186999]: 2025-11-24 01:57:08.596 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.599 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap56b453c1-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.599 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[79841f5d-92f0-47ef-8117-a29b7019bc98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.603 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ac626eb7-c6b7-4453-92df-31e1d9b491ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.633 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[b64686ca-bf5e-4167-87e9-f1ddd0c5b1c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.663 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d40524-29f7-4340-838d-4ea2579dfdc7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:08.667 104238 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpg4_4383c/privsep.sock']
Nov 24 01:57:08 compute-0 nova_compute[186999]: 2025-11-24 01:57:08.768 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.100 187003 DEBUG nova.compute.manager [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.100 187003 DEBUG oslo_concurrency.lockutils [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.101 187003 DEBUG oslo_concurrency.lockutils [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.101 187003 DEBUG oslo_concurrency.lockutils [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.101 187003 DEBUG nova.compute.manager [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] No waiting events found dispatching network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.101 187003 WARNING nova.compute.manager [req-7f1dbb15-a030-4559-a57a-37731bd6a47f req-50b05e3c-a52b-43c7-84ac-63230e1caa68 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received unexpected event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da for instance with vm_state active and task_state None.
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.413 104238 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.417 104238 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpg4_4383c/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.272 213319 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.276 213319 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.278 213319 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.279 213319 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213319
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.420 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[b747ba58-fa95-4756-98ab-aa902997320b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:09 compute-0 nova_compute[186999]: 2025-11-24 01:57:09.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.910 213319 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.911 213319 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:09.911 213319 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.529 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[30f8e1a9-df52-4bb9-9550-a7de180d4515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.551 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a540c169-690e-46e9-9a5e-77ac2a89ff18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5613] manager: (tap56b453c1-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5636] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/22)
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5644] device (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5666] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5676] device (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5697] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5709] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5717] device (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.5724] device (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.577 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.591 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 systemd-udevd[213332]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.612 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[13f49afb-36d9-40f7-9a03-249a786a2ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.615 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[9a705f39-bcfe-4421-ab6d-047c9be03aef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.6452] device (tap56b453c1-e0): carrier: link connected
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.652 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[82febe18-b97a-4135-b15b-2f4e172bfdd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.673 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[92721914-65bf-4c42-95a9-2d009948da80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56b453c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:4d:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 289730, 'reachable_time': 21669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213350, 'error': None, 'target': 'ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.692 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[92dd0c80-e9ee-4ae5-872d-4cfbed7a40f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:4d72'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 289730, 'tstamp': 289730}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213351, 'error': None, 'target': 'ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.712 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[21956bca-ffb8-4e28-bd3a-6ffee47bb828]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56b453c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:4d:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 289730, 'reachable_time': 21669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213352, 'error': None, 'target': 'ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.749 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcb5120-ea06-48a8-8eab-c4935fba6ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.786 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.816 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[699bfddd-5e72-4703-972c-a510a02ac048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.818 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56b453c1-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.818 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.818 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56b453c1-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.820 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 NetworkManager[55458]: <info>  [1763949430.8211] manager: (tap56b453c1-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 24 01:57:10 compute-0 kernel: tap56b453c1-e0: entered promiscuous mode
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.824 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.825 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap56b453c1-e0, col_values=(('external_ids', {'iface-id': '649ed6c5-be61-432a-8737-5ac2ae18aea0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.826 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 ovn_controller[95380]: 2025-11-24T01:57:10Z|00031|binding|INFO|Releasing lport 649ed6c5-be61-432a-8737-5ac2ae18aea0 from this chassis (sb_readonly=0)
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.870 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.871 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/56b453c1-ee78-40be-9431-0afc399d7dbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/56b453c1-ee78-40be-9431-0afc399d7dbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.872 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[259a5560-7167-4e0a-a633-edd705e827ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.873 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: global
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-56b453c1-ee78-40be-9431-0afc399d7dbc
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/56b453c1-ee78-40be-9431-0afc399d7dbc.pid.haproxy
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 56b453c1-ee78-40be-9431-0afc399d7dbc
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 01:57:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:10.873 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc', 'env', 'PROCESS_TAG=haproxy-56b453c1-ee78-40be-9431-0afc399d7dbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/56b453c1-ee78-40be-9431-0afc399d7dbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 01:57:10 compute-0 nova_compute[186999]: 2025-11-24 01:57:10.874 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:11 compute-0 podman[213384]: 2025-11-24 01:57:11.258775311 +0000 UTC m=+0.057766922 container create f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.263 187003 DEBUG nova.compute.manager [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.263 187003 DEBUG nova.compute.manager [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing instance network info cache due to event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.263 187003 DEBUG oslo_concurrency.lockutils [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.264 187003 DEBUG oslo_concurrency.lockutils [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.264 187003 DEBUG nova.network.neutron [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:57:11 compute-0 systemd[1]: Started libpod-conmon-f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7.scope.
Nov 24 01:57:11 compute-0 podman[213384]: 2025-11-24 01:57:11.228234508 +0000 UTC m=+0.027226129 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:57:11 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:57:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2a061fa901ba563c24a90ea81f8a5cead5184654c05133817f4e23be98631cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:57:11 compute-0 podman[213384]: 2025-11-24 01:57:11.352560563 +0000 UTC m=+0.151552174 container init f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:57:11 compute-0 podman[213384]: 2025-11-24 01:57:11.360339616 +0000 UTC m=+0.159331217 container start f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:57:11 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [NOTICE]   (213403) : New worker (213405) forked
Nov 24 01:57:11 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [NOTICE]   (213403) : Loading success.
Nov 24 01:57:11 compute-0 nova_compute[186999]: 2025-11-24 01:57:11.605 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.612 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}43828986269c1696035cbad96c3769c98254d94b00b21516fe826c9a9469238c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.708 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Mon, 24 Nov 2025 01:57:11 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-80039e7d-1d81-4117-8e61-372ac5697179 x-openstack-request-id: req-80039e7d-1d81-4117-8e61-372ac5697179 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.708 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1a6c1b83-8cec-446b-85f8-bd5fc311334e", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1a6c1b83-8cec-446b-85f8-bd5fc311334e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1a6c1b83-8cec-446b-85f8-bd5fc311334e"}]}, {"id": "b1e8dafc-0e0f-4b06-ab61-2691966769fd", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.708 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-80039e7d-1d81-4117-8e61-372ac5697179 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.711 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}43828986269c1696035cbad96c3769c98254d94b00b21516fe826c9a9469238c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.809 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Mon, 24 Nov 2025 01:57:11 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9712d73a-5798-4acc-9d8a-5d491a892da8 x-openstack-request-id: req-9712d73a-5798-4acc-9d8a-5d491a892da8 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.810 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "b1e8dafc-0e0f-4b06-ab61-2691966769fd", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.810 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/b1e8dafc-0e0f-4b06-ab61-2691966769fd used request id req-9712d73a-5798-4acc-9d8a-5d491a892da8 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.811 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'name': 'tempest-TestNetworkBasicOps-server-176359546', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'hostId': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.811 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.837 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.latency volume: 342666155 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.837 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.latency volume: 2795010 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61014ae2-b61a-474f-a547-05171a86a52c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 342666155, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.811961', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e45e86de-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '860dcac10868351e2a09d08a1f076baf5247b3bea14999b20708d92cd0624f0b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2795010, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.811961', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e45e96ce-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '2a1fdc5672713a488b8ac3756db5d97b507c1f5a50ae463e7da98977ea4eb991'}]}, 'timestamp': '2025-11-24 01:57:11.838255', '_unique_id': '348f3a1b41ba43cc8089ffea620895c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.846 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.849 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.849 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.850 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd01ee355-d45c-424e-8715-c9f39c68ed52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.849834', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4606f1c-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '9dccabc8a275c9e1cbf620983ff34fecf01ec0e6895abdbd8d0037895b0ca7e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.849834', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4607c46-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '5368101502173b8d1f913cd8bbf7794dfdd5468938c782246387c4645c06cdd6'}]}, 'timestamp': '2025-11-24 01:57:11.850662', '_unique_id': '6869859bbf2c4a3d8cab334af7849d69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.851 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.856 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a4dcff35-86ac-46bc-939c-dc6316ffd80f / tapb1da3c12-e6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.856 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef4db579-2925-4b88-aeb6-f6d628df0fc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.852565', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46174f2-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': 'e7d2b8d05efb136c16072cb57763d6d717cafba7a251b93c55e4a8614ea249d7'}]}, 'timestamp': '2025-11-24 01:57:11.857156', '_unique_id': 'f306260e81444cbab0aed91d8132ba6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.858 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.859 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.882 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.882 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance a4dcff35-86ac-46bc-939c-dc6316ffd80f: ceilometer.compute.pollsters.NoVolumeException
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.883 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.883 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>]
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.883 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.884 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '225cf04d-c1f7-4bdd-9bcd-169bad492300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.883940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4659fdc-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '4c40ad449a37bbb849917a8882441ad3cfca242d2039345ad923fa688cdc3a8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.883940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e465ab30-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': 'eedaf37306e1e220300e707e5ffac85e908855fe4c74aaa309ba024072a48a94'}]}, 'timestamp': '2025-11-24 01:57:11.884592', '_unique_id': '591c4d5ea2f44f4496982d62c3528a5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.885 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.900 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.900 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eeddc14-5d74-4169-acf6-6f17ac2827cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.886863', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4681a28-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': '34274abde728fa76be00d6e8d5d30b9215df8fdf3ae4d592709066b48a5185ea'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.886863', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e46825a4-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': '97e374e90771922511f2a01c930e6f5e01e5fe53e205e621b3eeca7b739282ce'}]}, 'timestamp': '2025-11-24 01:57:11.900826', '_unique_id': '124fd6c963d04898ba2f0ce3a8c11add'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.901 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.902 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7b288e-07ce-41cf-91f1-6fd0bfc65584', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.902385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e4687356-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '3ce82cbe9450e3643fe88b8991b6f897bd4312f78606ea9490330e62cf447362'}]}, 'timestamp': '2025-11-24 01:57:11.902828', '_unique_id': '5e5bf9c78074404fa79790811b858d0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.903 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.904 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>]
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.904 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.904 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.904 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46f5a1cd-aad9-42ca-87b9-60d6863d8fc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.904358', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e468ba0a-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '013b03269c0f983d089278b31e0c1a8aad3aea4ad8d368a16cdda7dc994e0764'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.904358', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e468c4a0-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '829e5a3ad964bb3cd67e0b77d6e376b451e9b06f13763f47dde7f9fe3c5eefae'}]}, 'timestamp': '2025-11-24 01:57:11.904899', '_unique_id': '50d832e477cc43358db49cbd7cdf3fa4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.906 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.906 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25f7642c-cd1f-40ba-9c0c-58ecc9406af2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.906014', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e468f9f2-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '283a1a7f9f786d02f841b5b58b3026626ede66afd072a6e51c946acfc87f323f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.906014', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e46903d4-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '493abde5deb45b3a19f30c02e80079392531193d4309e481b647c4f2951084b2'}]}, 'timestamp': '2025-11-24 01:57:11.906521', '_unique_id': '1385bbc1a6424634b982e6b083a17792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.907 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdef896e-85be-4e42-a011-317750935bb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.907684', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e4693afc-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '550b119be9affa3712b1f19724e4fe4fd84ebe3d87477808c43e61b0e111f09d'}]}, 'timestamp': '2025-11-24 01:57:11.907948', '_unique_id': 'e3fb6e6563024f718c1dc778c8e41152'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ded6ec8-4f54-45a5-ac1e-fa1c2553057a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.909085', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e469715c-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '4610572af9360a2af57d978ce11ed00e001aadf54cacd358ec8d9b9eaabfc978'}]}, 'timestamp': '2025-11-24 01:57:11.909314', '_unique_id': 'bedf376cdd024e4482d0b3d1418ad169'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.909 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>]
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-176359546>]
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.910 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3462258-b1b2-434b-849c-7b39dbd1eb28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.910955', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e469ba72-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': 'c25191004262442d375dd4a46e4d8fa954126b64df8f9910650532473f5d9d97'}]}, 'timestamp': '2025-11-24 01:57:11.911207', '_unique_id': 'e1e016f4bb434918b3a58b42e0eb8d57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.911 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.912 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.912 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/cpu volume: 4400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53967bc3-a7be-4080-8977-6c64c1f1bb99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4400000000, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'timestamp': '2025-11-24T01:57:11.912351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e469f122-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.596157347, 'message_signature': 'c6d0d9f9497da8c3fd7017bea9ee13cbb378d20096c8fd4b94722b0a63a6f162'}]}, 'timestamp': '2025-11-24 01:57:11.912592', '_unique_id': '57c263e1e0974e8a9a6608f8c8946889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.913 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917b5d5c-71c5-4e9b-80f7-9d8492ea7505', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.913782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46a2930-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '1a1aa2c892c4a86a9d9dd4baf4217175dc2ebe0a98bcab976ce7b150546e037a'}]}, 'timestamp': '2025-11-24 01:57:11.914023', '_unique_id': '1403c003e76a4535bd3f69004f02b36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.914 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '656b0564-5054-42ab-ad09-82ece9e72354', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.915099', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46a5c20-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '930a0416d8781a9b4e43e91cd7ae84adc00ffb9cb4dd25804132bc35c6f20946'}]}, 'timestamp': '2025-11-24 01:57:11.915324', '_unique_id': 'a457be9ec41e48d683e1854c2bbfd208'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.915 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.916 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.916 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e0b147-f8d7-44bb-ae45-0ddc0ce294e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.916395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e46a8f1a-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': '5f133ab861cb1b62b6b17c2f95b9e3324728447f5d099820f8391a374c81f2a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.916395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e46a9802-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': '6cd02faa899159ebb7aad342f530eac71869f82b485a77f12af201ce6689601e'}]}, 'timestamp': '2025-11-24 01:57:11.916844', '_unique_id': 'adf452188c42486e915b8955007b67c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.917 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f99edc35-c142-4a69-99c7-7b169ad2a062', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.917932', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e46acab6-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': '9baaf83b8dd13f503b8e2f61b40dcbf8dcc720f1c91f4e3ab66244a0dfbecd36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.917932', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e46ad268-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.528305747, 'message_signature': 'fb8b26db1a6b61b1cf7573eb52797d265600a0b059512a34eee6fec31a136acc'}]}, 'timestamp': '2025-11-24 01:57:11.918337', '_unique_id': 'e39e50e79cea45688f515a114d848487'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.918 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.919 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1f41f19-13dc-42ac-948f-ccda14123563', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.919447', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46b0602-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '88ba4499e8b24272238c4aa1fdff41f09aad5a7317e2bef006a973d751b168fb'}]}, 'timestamp': '2025-11-24 01:57:11.919674', '_unique_id': '056ab2cef7c34ffb9eb43edf3b84ac36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.920 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d835886-a1f3-4d24-9567-7245b396e785', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-vda', 'timestamp': '2025-11-24T01:57:11.920806', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e46b3d5c-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': 'd59c3936ff13474240bd07a05587e5281e7a048c0bf1b3bda2adcfe2b9d8ad84'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f-sda', 'timestamp': '2025-11-24T01:57:11.920806', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'instance-00000001', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e46b4568-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.603194829, 'message_signature': '7e3c3d46bc9565ec1ec419d6ee700f52d85add96669b733eb62f7ccf8cdea41a'}]}, 'timestamp': '2025-11-24 01:57:11.921281', '_unique_id': 'c38d2fa119b249b2bdd4fc7847c2b598'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.921 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.922 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6813cd7b-5d75-4b7a-949c-ec10a6db22aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.922447', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46b7bfa-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '6ffde86e898c50babfc9253511f319b38ba4addd185f40d2a6df9e77dad86141'}]}, 'timestamp': '2025-11-24 01:57:11.922695', '_unique_id': '501092ee024c4e7da061bd3387ba64e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.923 12 DEBUG ceilometer.compute.pollsters [-] a4dcff35-86ac-46bc-939c-dc6316ffd80f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3013b1b-f8e9-41e0-8f4b-905c017af727', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000001-a4dcff35-86ac-46bc-939c-dc6316ffd80f-tapb1da3c12-e6', 'timestamp': '2025-11-24T01:57:11.923797', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-176359546', 'name': 'tapb1da3c12-e6', 'instance_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:3f:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1da3c12-e6'}, 'message_id': 'e46bb0ca-c8d8-11f0-959b-fa163eb968c1', 'monotonic_time': 2898.568928999, 'message_signature': '42068e186b2db5ec958420709ffe3e00cc346e03854953775a2b80f65f75fa61'}]}, 'timestamp': '2025-11-24 01:57:11.924045', '_unique_id': '484564dc534149919c81de518d5032aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:57:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:57:11.924 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:57:12 compute-0 nova_compute[186999]: 2025-11-24 01:57:12.380 187003 DEBUG nova.network.neutron [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updated VIF entry in instance network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:57:12 compute-0 nova_compute[186999]: 2025-11-24 01:57:12.380 187003 DEBUG nova.network.neutron [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:57:12 compute-0 nova_compute[186999]: 2025-11-24 01:57:12.397 187003 DEBUG oslo_concurrency.lockutils [req-b676b6f5-b81f-42ce-ac66-121d0b1e2881 req-ded65854-5f33-4ba9-8ea5-b7afbde34424 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:57:15 compute-0 nova_compute[186999]: 2025-11-24 01:57:15.795 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:16 compute-0 nova_compute[186999]: 2025-11-24 01:57:16.609 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:16 compute-0 podman[213414]: 2025-11-24 01:57:16.810172741 +0000 UTC m=+0.059487302 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 01:57:19 compute-0 podman[213452]: 2025-11-24 01:57:19.809767948 +0000 UTC m=+0.064266598 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter)
Nov 24 01:57:19 compute-0 ovn_controller[95380]: 2025-11-24T01:57:19Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:3f:6b 10.100.0.4
Nov 24 01:57:19 compute-0 ovn_controller[95380]: 2025-11-24T01:57:19Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:3f:6b 10.100.0.4
Nov 24 01:57:19 compute-0 sshd-session[213449]: Received disconnect from 154.90.59.75 port 48594:11: Bye Bye [preauth]
Nov 24 01:57:19 compute-0 sshd-session[213449]: Disconnected from authenticating user root 154.90.59.75 port 48594 [preauth]
Nov 24 01:57:20 compute-0 nova_compute[186999]: 2025-11-24 01:57:20.799 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:21 compute-0 nova_compute[186999]: 2025-11-24 01:57:21.642 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:25 compute-0 nova_compute[186999]: 2025-11-24 01:57:25.537 187003 INFO nova.compute.manager [None req-b7446a74-4629-44e4-921a-cd797d6e867c e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Get console output
Nov 24 01:57:25 compute-0 nova_compute[186999]: 2025-11-24 01:57:25.663 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 01:57:25 compute-0 nova_compute[186999]: 2025-11-24 01:57:25.802 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:26 compute-0 nova_compute[186999]: 2025-11-24 01:57:26.645 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:26 compute-0 podman[213474]: 2025-11-24 01:57:26.829817954 +0000 UTC m=+0.071474835 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 01:57:30 compute-0 nova_compute[186999]: 2025-11-24 01:57:30.804 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:30 compute-0 podman[213498]: 2025-11-24 01:57:30.808295653 +0000 UTC m=+0.063580299 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 01:57:31 compute-0 nova_compute[186999]: 2025-11-24 01:57:31.647 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:35 compute-0 podman[213519]: 2025-11-24 01:57:35.803204929 +0000 UTC m=+0.057507646 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 24 01:57:35 compute-0 nova_compute[186999]: 2025-11-24 01:57:35.807 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:36 compute-0 nova_compute[186999]: 2025-11-24 01:57:36.650 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.235 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.235 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.249 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.306 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.307 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.314 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.315 187003 INFO nova.compute.claims [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Claim successful on node compute-0.ctlplane.example.com
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.412 187003 DEBUG nova.compute.provider_tree [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.421 187003 DEBUG nova.scheduler.client.report [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.437 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.438 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.481 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.482 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.496 187003 INFO nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.516 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.596 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.597 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.597 187003 INFO nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Creating image(s)
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.598 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.598 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.598 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.610 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.698 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.699 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.700 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.711 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.776 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.777 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.826 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.827 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.828 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:38 compute-0 podman[213543]: 2025-11-24 01:57:38.833334748 +0000 UTC m=+0.072365620 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:57:38 compute-0 podman[213544]: 2025-11-24 01:57:38.840942796 +0000 UTC m=+0.086474934 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.880 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.882 187003 DEBUG nova.virt.disk.api [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.883 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.935 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.936 187003 DEBUG nova.virt.disk.api [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.937 187003 DEBUG nova.objects.instance [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.953 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.954 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Ensure instance console log exists: /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.954 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.955 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:38 compute-0 nova_compute[186999]: 2025-11-24 01:57:38.955 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:39 compute-0 nova_compute[186999]: 2025-11-24 01:57:39.605 187003 DEBUG nova.policy [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 01:57:40 compute-0 nova_compute[186999]: 2025-11-24 01:57:40.810 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:41 compute-0 nova_compute[186999]: 2025-11-24 01:57:41.686 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:42 compute-0 nova_compute[186999]: 2025-11-24 01:57:42.638 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Successfully created port: f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 01:57:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:44.093 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:57:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:44.095 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 01:57:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:44.096 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.135 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.310 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Successfully updated port: f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.328 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.328 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.329 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.440 187003 DEBUG nova.compute.manager [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-changed-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.441 187003 DEBUG nova.compute.manager [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Refreshing instance network info cache due to event network-changed-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.442 187003 DEBUG oslo_concurrency.lockutils [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:57:44 compute-0 nova_compute[186999]: 2025-11-24 01:57:44.517 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.238 187003 DEBUG nova.network.neutron [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Updating instance_info_cache with network_info: [{"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.253 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.254 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Instance network_info: |[{"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.255 187003 DEBUG oslo_concurrency.lockutils [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.255 187003 DEBUG nova.network.neutron [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Refreshing network info cache for port f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.259 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Start _get_guest_xml network_info=[{"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.263 187003 WARNING nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.269 187003 DEBUG nova.virt.libvirt.host [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.269 187003 DEBUG nova.virt.libvirt.host [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.272 187003 DEBUG nova.virt.libvirt.host [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.273 187003 DEBUG nova.virt.libvirt.host [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.274 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.274 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.275 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.276 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.276 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.276 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.277 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.277 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.278 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.278 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.279 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.279 187003 DEBUG nova.virt.hardware [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.286 187003 DEBUG nova.virt.libvirt.vif [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099808998',display_name='tempest-TestNetworkBasicOps-server-2099808998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099808998',id=2,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDAHsh2kLuquw00ft5Hc6vWzJ+BBVHAQMlJKxd3wyTX98lAip7YQZRK/ov68t+Ni7vk8VaEeIyZzP2WZ9VhF8yXPJ54T4f1YYa4h1+tjeheIOjUHE7znLFDEDdK3+6zUw==',key_name='tempest-TestNetworkBasicOps-372209157',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-smi5i0mb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:57:38Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.287 187003 DEBUG nova.network.os_vif_util [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.288 187003 DEBUG nova.network.os_vif_util [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.289 187003 DEBUG nova.objects.instance [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.303 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] End _get_guest_xml xml=<domain type="kvm">
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <uuid>fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b</uuid>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <name>instance-00000002</name>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-2099808998</nova:name>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 01:57:45</nova:creationTime>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         <nova:port uuid="f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661">
Nov 24 01:57:45 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <system>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="serial">fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="uuid">fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </system>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <os>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </os>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <features>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </features>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.config"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:cc:3d:b1"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <target dev="tapf14a97a8-8f"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/console.log" append="off"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <video>
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </video>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 01:57:45 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 01:57:45 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:57:45 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:57:45 compute-0 nova_compute[186999]: </domain>
Nov 24 01:57:45 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.304 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Preparing to wait for external event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.304 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.305 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.306 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.307 187003 DEBUG nova.virt.libvirt.vif [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099808998',display_name='tempest-TestNetworkBasicOps-server-2099808998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099808998',id=2,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDAHsh2kLuquw00ft5Hc6vWzJ+BBVHAQMlJKxd3wyTX98lAip7YQZRK/ov68t+Ni7vk8VaEeIyZzP2WZ9VhF8yXPJ54T4f1YYa4h1+tjeheIOjUHE7znLFDEDdK3+6zUw==',key_name='tempest-TestNetworkBasicOps-372209157',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-smi5i0mb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:57:38Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.308 187003 DEBUG nova.network.os_vif_util [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.310 187003 DEBUG nova.network.os_vif_util [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.311 187003 DEBUG os_vif [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.312 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.313 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.314 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.319 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.320 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf14a97a8-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.321 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf14a97a8-8f, col_values=(('external_ids', {'iface-id': 'f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:3d:b1', 'vm-uuid': 'fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.365 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 NetworkManager[55458]: <info>  [1763949465.3667] manager: (tapf14a97a8-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.369 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.376 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.377 187003 INFO os_vif [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f')
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.436 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.437 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.437 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:cc:3d:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.438 187003 INFO nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Using config drive
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.732 187003 INFO nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Creating config drive at /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.config
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.737 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwjp3a8n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.874 187003 DEBUG oslo_concurrency.processutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwjp3a8n" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:57:45 compute-0 kernel: tapf14a97a8-8f: entered promiscuous mode
Nov 24 01:57:45 compute-0 NetworkManager[55458]: <info>  [1763949465.9315] manager: (tapf14a97a8-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.932 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 ovn_controller[95380]: 2025-11-24T01:57:45Z|00032|binding|INFO|Claiming lport f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 for this chassis.
Nov 24 01:57:45 compute-0 ovn_controller[95380]: 2025-11-24T01:57:45Z|00033|binding|INFO|f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661: Claiming fa:16:3e:cc:3d:b1 10.100.0.20
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.944 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:3d:b1 10.100.0.20'], port_security=['fa:16:3e:cc:3d:b1 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9582bfd7-28d5-4551-95f0-fded72a25ea6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d060aa1-adff-4fcd-97d0-52db12420404, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.946 104238 INFO neutron.agent.ovn.metadata.agent [-] Port f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 in datapath 6357defe-7391-4a0a-bf5c-a6e905d3faf9 bound to our chassis
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.947 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6357defe-7391-4a0a-bf5c-a6e905d3faf9
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.957 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[11cbb0d7-fedb-435a-8452-a0b99b70a9ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.958 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6357defe-71 in ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.960 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6357defe-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.960 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a3805b-7ed4-4b82-8362-3cc84f7b8afa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.961 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb16a03-00af-4b75-96c0-1cc875c6282d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.965 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 ovn_controller[95380]: 2025-11-24T01:57:45Z|00034|binding|INFO|Setting lport f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 ovn-installed in OVS
Nov 24 01:57:45 compute-0 ovn_controller[95380]: 2025-11-24T01:57:45Z|00035|binding|INFO|Setting lport f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 up in Southbound
Nov 24 01:57:45 compute-0 nova_compute[186999]: 2025-11-24 01:57:45.969 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:45 compute-0 systemd-machined[153319]: New machine qemu-2-instance-00000002.
Nov 24 01:57:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:45.983 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[258f3ac6-2407-4938-ae94-a5c8169684b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.012 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0a699b65-9b78-428f-a825-ca9bed0e46d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 systemd-udevd[213626]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:57:46 compute-0 NetworkManager[55458]: <info>  [1763949466.0346] device (tapf14a97a8-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:57:46 compute-0 NetworkManager[55458]: <info>  [1763949466.0366] device (tapf14a97a8-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.047 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[9124473b-8a5a-44c9-b684-328ef579dc7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.051 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[adfa20d0-ed17-4c34-8d18-e7adf6e2a9b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 NetworkManager[55458]: <info>  [1763949466.0535] manager: (tap6357defe-70): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.080 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[67fbccc4-fc9c-473a-8299-17154374a7f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.083 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[df00f448-e982-4ef1-a8dc-86dc0292e82e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 NetworkManager[55458]: <info>  [1763949466.1045] device (tap6357defe-70): carrier: link connected
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.108 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[506c7904-dc12-44c5-a9ea-33728e83952f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.122 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[af4a5fb2-4f5a-42de-9d8b-367a60596697]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6357defe-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:d9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293276, 'reachable_time': 34704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213656, 'error': None, 'target': 'ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.133 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8f5b9a-e0d1-4a7b-abc6-a8f3183ba9df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:d9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 293276, 'tstamp': 293276}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213657, 'error': None, 'target': 'ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.146 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab59635-4187-43e3-895a-a2bd4ff10712]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6357defe-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:d9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293276, 'reachable_time': 34704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213658, 'error': None, 'target': 'ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.173 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9aedc105-16b5-41e6-a3d9-f50339ad8825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.224 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c16697-88b6-4312-9e8e-4ccb53cea222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.226 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6357defe-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.226 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.226 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6357defe-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.228 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:46 compute-0 NetworkManager[55458]: <info>  [1763949466.2290] manager: (tap6357defe-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 24 01:57:46 compute-0 kernel: tap6357defe-70: entered promiscuous mode
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.231 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.233 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6357defe-70, col_values=(('external_ids', {'iface-id': '463d1a27-9b98-4dd7-97ca-5298f02afe92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.234 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:46 compute-0 ovn_controller[95380]: 2025-11-24T01:57:46Z|00036|binding|INFO|Releasing lport 463d1a27-9b98-4dd7-97ca-5298f02afe92 from this chassis (sb_readonly=0)
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.253 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.254 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6357defe-7391-4a0a-bf5c-a6e905d3faf9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6357defe-7391-4a0a-bf5c-a6e905d3faf9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.255 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfbaa72-6204-4cf3-86dd-8b2adbaef0d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.256 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: global
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-6357defe-7391-4a0a-bf5c-a6e905d3faf9
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/6357defe-7391-4a0a-bf5c-a6e905d3faf9.pid.haproxy
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 6357defe-7391-4a0a-bf5c-a6e905d3faf9
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 01:57:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:46.256 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'env', 'PROCESS_TAG=haproxy-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6357defe-7391-4a0a-bf5c-a6e905d3faf9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.505 187003 DEBUG nova.network.neutron [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Updated VIF entry in instance network info cache for port f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.506 187003 DEBUG nova.network.neutron [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Updating instance_info_cache with network_info: [{"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.520 187003 DEBUG oslo_concurrency.lockutils [req-9c34b2f1-3d08-4a11-be0e-e26bd1426be6 req-ece0de2f-50dc-41d8-981a-3d6c7f1800de 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.557 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949466.557124, fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.557 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] VM Started (Lifecycle Event)
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.592 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.596 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949466.5595458, fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.596 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] VM Paused (Lifecycle Event)
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.608 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.612 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.629 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.689 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:46 compute-0 podman[213697]: 2025-11-24 01:57:46.720940071 +0000 UTC m=+0.074434645 container create dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:57:46 compute-0 systemd[1]: Started libpod-conmon-dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75.scope.
Nov 24 01:57:46 compute-0 podman[213697]: 2025-11-24 01:57:46.689424609 +0000 UTC m=+0.042919173 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:57:46 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:57:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bf0624ec8969fe4dfa734a0bbaa2ed00fd2dcf22028be06562d9ab6b38016b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:57:46 compute-0 podman[213697]: 2025-11-24 01:57:46.828219226 +0000 UTC m=+0.181713790 container init dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:57:46 compute-0 podman[213697]: 2025-11-24 01:57:46.834556153 +0000 UTC m=+0.188050697 container start dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 24 01:57:46 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [NOTICE]   (213717) : New worker (213719) forked
Nov 24 01:57:46 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [NOTICE]   (213717) : Loading success.
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.875 187003 DEBUG nova.compute.manager [req-80bf206b-bd50-4a9b-a16f-92e28b14d44a req-f285dfd6-7079-423c-9522-1ba7329adc6e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.876 187003 DEBUG oslo_concurrency.lockutils [req-80bf206b-bd50-4a9b-a16f-92e28b14d44a req-f285dfd6-7079-423c-9522-1ba7329adc6e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.876 187003 DEBUG oslo_concurrency.lockutils [req-80bf206b-bd50-4a9b-a16f-92e28b14d44a req-f285dfd6-7079-423c-9522-1ba7329adc6e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.876 187003 DEBUG oslo_concurrency.lockutils [req-80bf206b-bd50-4a9b-a16f-92e28b14d44a req-f285dfd6-7079-423c-9522-1ba7329adc6e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.876 187003 DEBUG nova.compute.manager [req-80bf206b-bd50-4a9b-a16f-92e28b14d44a req-f285dfd6-7079-423c-9522-1ba7329adc6e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Processing event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.877 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.881 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949466.881291, fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.882 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] VM Resumed (Lifecycle Event)
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.887 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.898 187003 INFO nova.virt.libvirt.driver [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Instance spawned successfully.
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.898 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.905 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.907 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.915 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.915 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.916 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.916 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.916 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.917 187003 DEBUG nova.virt.libvirt.driver [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.921 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.960 187003 INFO nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Took 8.36 seconds to spawn the instance on the hypervisor.
Nov 24 01:57:46 compute-0 nova_compute[186999]: 2025-11-24 01:57:46.960 187003 DEBUG nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:57:47 compute-0 nova_compute[186999]: 2025-11-24 01:57:47.008 187003 INFO nova.compute.manager [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Took 8.72 seconds to build instance.
Nov 24 01:57:47 compute-0 nova_compute[186999]: 2025-11-24 01:57:47.022 187003 DEBUG oslo_concurrency.lockutils [None req-b685f112-b34a-4cbd-b944-50449057a01d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:47 compute-0 podman[213728]: 2025-11-24 01:57:47.868383882 +0000 UTC m=+0.105069873 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 24 01:57:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:48.417 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:48.419 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:57:48.420 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.956 187003 DEBUG nova.compute.manager [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.958 187003 DEBUG oslo_concurrency.lockutils [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.958 187003 DEBUG oslo_concurrency.lockutils [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.959 187003 DEBUG oslo_concurrency.lockutils [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.959 187003 DEBUG nova.compute.manager [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] No waiting events found dispatching network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:57:48 compute-0 nova_compute[186999]: 2025-11-24 01:57:48.960 187003 WARNING nova.compute.manager [req-fbb8240b-4cc9-48ba-8f69-6fd6e4c8f24e req-c2f33b37-9527-4387-9156-ad6e6ca48a09 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received unexpected event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 for instance with vm_state active and task_state None.
Nov 24 01:57:50 compute-0 nova_compute[186999]: 2025-11-24 01:57:50.367 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:50 compute-0 podman[213750]: 2025-11-24 01:57:50.826706324 +0000 UTC m=+0.077912143 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 01:57:51 compute-0 nova_compute[186999]: 2025-11-24 01:57:51.690 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:55 compute-0 nova_compute[186999]: 2025-11-24 01:57:55.371 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:56 compute-0 nova_compute[186999]: 2025-11-24 01:57:56.691 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:57:57 compute-0 podman[213775]: 2025-11-24 01:57:57.831486028 +0000 UTC m=+0.071439861 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:57:59 compute-0 ovn_controller[95380]: 2025-11-24T01:57:59Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:3d:b1 10.100.0.20
Nov 24 01:57:59 compute-0 ovn_controller[95380]: 2025-11-24T01:57:59Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:3d:b1 10.100.0.20
Nov 24 01:58:00 compute-0 nova_compute[186999]: 2025-11-24 01:58:00.375 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:00 compute-0 sshd-session[213815]: Invalid user ftpuser1 from 46.188.119.26 port 36130
Nov 24 01:58:00 compute-0 sshd-session[213815]: Received disconnect from 46.188.119.26 port 36130:11: Bye Bye [preauth]
Nov 24 01:58:00 compute-0 sshd-session[213815]: Disconnected from invalid user ftpuser1 46.188.119.26 port 36130 [preauth]
Nov 24 01:58:01 compute-0 nova_compute[186999]: 2025-11-24 01:58:01.694 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:01 compute-0 podman[213817]: 2025-11-24 01:58:01.803433383 +0000 UTC m=+0.054006413 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:58:04 compute-0 nova_compute[186999]: 2025-11-24 01:58:04.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:05 compute-0 nova_compute[186999]: 2025-11-24 01:58:05.413 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:05 compute-0 nova_compute[186999]: 2025-11-24 01:58:05.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.603 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.604 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.604 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.605 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.605 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.609 187003 INFO nova.compute.manager [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Terminating instance
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.611 187003 DEBUG nova.compute.manager [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 01:58:06 compute-0 kernel: tapf14a97a8-8f (unregistering): left promiscuous mode
Nov 24 01:58:06 compute-0 NetworkManager[55458]: <info>  [1763949486.6434] device (tapf14a97a8-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 01:58:06 compute-0 ovn_controller[95380]: 2025-11-24T01:58:06Z|00037|binding|INFO|Releasing lport f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 from this chassis (sb_readonly=0)
Nov 24 01:58:06 compute-0 ovn_controller[95380]: 2025-11-24T01:58:06Z|00038|binding|INFO|Setting lport f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 down in Southbound
Nov 24 01:58:06 compute-0 ovn_controller[95380]: 2025-11-24T01:58:06Z|00039|binding|INFO|Removing iface tapf14a97a8-8f ovn-installed in OVS
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.702 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:06.711 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:3d:b1 10.100.0.20'], port_security=['fa:16:3e:cc:3d:b1 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9582bfd7-28d5-4551-95f0-fded72a25ea6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d060aa1-adff-4fcd-97d0-52db12420404, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:58:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:06.715 104238 INFO neutron.agent.ovn.metadata.agent [-] Port f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 in datapath 6357defe-7391-4a0a-bf5c-a6e905d3faf9 unbound from our chassis
Nov 24 01:58:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:06.717 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6357defe-7391-4a0a-bf5c-a6e905d3faf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.717 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:06.718 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf8fd6c-3d59-4756-ba09-7a50b5beb6f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:06 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:06.720 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9 namespace which is not needed anymore
Nov 24 01:58:06 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 24 01:58:06 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 12.389s CPU time.
Nov 24 01:58:06 compute-0 systemd-machined[153319]: Machine qemu-2-instance-00000002 terminated.
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.798 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 24 01:58:06 compute-0 podman[213837]: 2025-11-24 01:58:06.829031945 +0000 UTC m=+0.095063033 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd)
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.857 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.867 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.898 187003 INFO nova.virt.libvirt.driver [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Instance destroyed successfully.
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.899 187003 DEBUG nova.objects.instance [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.911 187003 DEBUG nova.virt.libvirt.vif [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:57:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099808998',display_name='tempest-TestNetworkBasicOps-server-2099808998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099808998',id=2,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDAHsh2kLuquw00ft5Hc6vWzJ+BBVHAQMlJKxd3wyTX98lAip7YQZRK/ov68t+Ni7vk8VaEeIyZzP2WZ9VhF8yXPJ54T4f1YYa4h1+tjeheIOjUHE7znLFDEDdK3+6zUw==',key_name='tempest-TestNetworkBasicOps-372209157',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:57:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-smi5i0mb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:57:46Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.911 187003 DEBUG nova.network.os_vif_util [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "address": "fa:16:3e:cc:3d:b1", "network": {"id": "6357defe-7391-4a0a-bf5c-a6e905d3faf9", "bridge": "br-int", "label": "tempest-network-smoke--987815614", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf14a97a8-8f", "ovs_interfaceid": "f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.912 187003 DEBUG nova.network.os_vif_util [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.913 187003 DEBUG os_vif [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.914 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.915 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf14a97a8-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.916 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.917 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.919 187003 INFO os_vif [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:3d:b1,bridge_name='br-int',has_traffic_filtering=True,id=f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661,network=Network(6357defe-7391-4a0a-bf5c-a6e905d3faf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf14a97a8-8f')
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.920 187003 INFO nova.virt.libvirt.driver [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Deleting instance files /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b_del
Nov 24 01:58:06 compute-0 nova_compute[186999]: 2025-11-24 01:58:06.920 187003 INFO nova.virt.libvirt.driver [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Deletion of /var/lib/nova/instances/fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b_del complete
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.019 187003 DEBUG nova.virt.libvirt.host [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.020 187003 INFO nova.virt.libvirt.host [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] UEFI support detected
Nov 24 01:58:07 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [NOTICE]   (213717) : haproxy version is 2.8.14-c23fe91
Nov 24 01:58:07 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [NOTICE]   (213717) : path to executable is /usr/sbin/haproxy
Nov 24 01:58:07 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [WARNING]  (213717) : Exiting Master process...
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.021 187003 INFO nova.compute.manager [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.021 187003 DEBUG oslo.service.loopingcall [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.022 187003 DEBUG nova.compute.manager [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.022 187003 DEBUG nova.network.neutron [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 01:58:07 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [ALERT]    (213717) : Current worker (213719) exited with code 143 (Terminated)
Nov 24 01:58:07 compute-0 neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9[213713]: [WARNING]  (213717) : All workers exited. Exiting... (0)
Nov 24 01:58:07 compute-0 systemd[1]: libpod-dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75.scope: Deactivated successfully.
Nov 24 01:58:07 compute-0 podman[213878]: 2025-11-24 01:58:07.031963177 +0000 UTC m=+0.173313214 container died dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 01:58:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75-userdata-shm.mount: Deactivated successfully.
Nov 24 01:58:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-14bf0624ec8969fe4dfa734a0bbaa2ed00fd2dcf22028be06562d9ab6b38016b-merged.mount: Deactivated successfully.
Nov 24 01:58:07 compute-0 podman[213878]: 2025-11-24 01:58:07.072850312 +0000 UTC m=+0.214200349 container cleanup dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 01:58:07 compute-0 systemd[1]: libpod-conmon-dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75.scope: Deactivated successfully.
Nov 24 01:58:07 compute-0 podman[213924]: 2025-11-24 01:58:07.136566097 +0000 UTC m=+0.040726212 container remove dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.142 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[400837d1-7d00-43fb-b29a-1449e12e5dde]: (4, ('Mon Nov 24 01:58:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9 (dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75)\ndac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75\nMon Nov 24 01:58:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9 (dac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75)\ndac758498c8db7ba4221b13423c89e370ee5a6f0147f6e46e994783f3cfbee75\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.144 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[293ffe97-6551-42e9-b3ab-ba90158038c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.145 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6357defe-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.147 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:07 compute-0 kernel: tap6357defe-70: left promiscuous mode
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.172 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.175 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0d1e9f23-6a54-446c-a198-83dda851dd7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.193 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[09591c57-e2fb-411c-af4f-bec78f5589e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.194 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d423d7-c71b-4d95-ad6d-c54c1acee2e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.215 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f679cb93-c7ca-40ed-905a-e08835b312e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293270, 'reachable_time': 30921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213939, 'error': None, 'target': 'ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d6357defe\x2d7391\x2d4a0a\x2dbf5c\x2da6e905d3faf9.mount: Deactivated successfully.
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.230 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6357defe-7391-4a0a-bf5c-a6e905d3faf9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 01:58:07 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:07.231 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[361f2b06-0d7c-45a9-bbdd-b2572b54f59c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.591 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.591 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.591 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.592 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a4dcff35-86ac-46bc-939c-dc6316ffd80f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.713 187003 DEBUG nova.compute.manager [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-unplugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.713 187003 DEBUG oslo_concurrency.lockutils [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.714 187003 DEBUG oslo_concurrency.lockutils [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.714 187003 DEBUG oslo_concurrency.lockutils [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.715 187003 DEBUG nova.compute.manager [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] No waiting events found dispatching network-vif-unplugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:58:07 compute-0 nova_compute[186999]: 2025-11-24 01:58:07.715 187003 DEBUG nova.compute.manager [req-a8a0e06c-321c-4ad8-ab75-f376b2fa33b4 req-f1f3b96c-be6b-4669-aafe-bdcdeb4ccaa2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-unplugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 01:58:08 compute-0 nova_compute[186999]: 2025-11-24 01:58:08.911 187003 DEBUG nova.network.neutron [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:08 compute-0 nova_compute[186999]: 2025-11-24 01:58:08.929 187003 INFO nova.compute.manager [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Took 1.91 seconds to deallocate network for instance.
Nov 24 01:58:08 compute-0 nova_compute[186999]: 2025-11-24 01:58:08.970 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:08 compute-0 nova_compute[186999]: 2025-11-24 01:58:08.971 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.011 187003 DEBUG nova.compute.manager [req-42763230-4cf5-4cc2-a47e-a8326e055567 req-3fe0f677-b0b1-4d94-b834-839a21bb9613 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-deleted-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.055 187003 DEBUG nova.compute.provider_tree [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.067 187003 DEBUG nova.scheduler.client.report [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.083 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.123 187003 INFO nova.scheduler.client.report [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.211 187003 DEBUG oslo_concurrency.lockutils [None req-7e8a9501-aa48-4ec1-b279-7c280574b9c5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.579 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.590 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.591 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.591 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.591 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.591 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.592 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.610 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.610 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.611 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.611 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.699 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:09 compute-0 podman[213942]: 2025-11-24 01:58:09.72965034 +0000 UTC m=+0.066615147 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.760 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.761 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.785 187003 DEBUG nova.compute.manager [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.785 187003 DEBUG oslo_concurrency.lockutils [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.786 187003 DEBUG oslo_concurrency.lockutils [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.786 187003 DEBUG oslo_concurrency.lockutils [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.786 187003 DEBUG nova.compute.manager [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] No waiting events found dispatching network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.786 187003 WARNING nova.compute.manager [req-3fc97195-9760-4da7-bf63-a9a355c1b0f7 req-c9040c56-021e-4e58-8977-9f69eccf35dd 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Received unexpected event network-vif-plugged-f14a97a8-8fcd-4b9b-9a95-f39d2f3d8661 for instance with vm_state deleted and task_state None.
Nov 24 01:58:09 compute-0 podman[213943]: 2025-11-24 01:58:09.813395165 +0000 UTC m=+0.140357442 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.821 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.983 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.985 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5631MB free_disk=73.43330764770508GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.985 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:09 compute-0 nova_compute[186999]: 2025-11-24 01:58:09.985 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.128 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance a4dcff35-86ac-46bc-939c-dc6316ffd80f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.129 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.129 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.176 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.186 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.200 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.200 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.380 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:10 compute-0 nova_compute[186999]: 2025-11-24 01:58:10.380 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:11 compute-0 nova_compute[186999]: 2025-11-24 01:58:11.762 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:11 compute-0 nova_compute[186999]: 2025-11-24 01:58:11.917 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:12 compute-0 nova_compute[186999]: 2025-11-24 01:58:12.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:58:12 compute-0 nova_compute[186999]: 2025-11-24 01:58:12.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:58:12 compute-0 ovn_controller[95380]: 2025-11-24T01:58:12Z|00040|binding|INFO|Releasing lport 649ed6c5-be61-432a-8737-5ac2ae18aea0 from this chassis (sb_readonly=0)
Nov 24 01:58:12 compute-0 nova_compute[186999]: 2025-11-24 01:58:12.978 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.232 187003 DEBUG nova.compute.manager [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.232 187003 DEBUG nova.compute.manager [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing instance network info cache due to event network-changed-b1da3c12-e629-4325-be7d-3295c80a73da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.232 187003 DEBUG oslo_concurrency.lockutils [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.233 187003 DEBUG oslo_concurrency.lockutils [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.233 187003 DEBUG nova.network.neutron [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Refreshing network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.308 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.308 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.308 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.309 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.309 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.310 187003 INFO nova.compute.manager [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Terminating instance
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.311 187003 DEBUG nova.compute.manager [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 01:58:14 compute-0 kernel: tapb1da3c12-e6 (unregistering): left promiscuous mode
Nov 24 01:58:14 compute-0 NetworkManager[55458]: <info>  [1763949494.3381] device (tapb1da3c12-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.338 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 ovn_controller[95380]: 2025-11-24T01:58:14Z|00041|binding|INFO|Releasing lport b1da3c12-e629-4325-be7d-3295c80a73da from this chassis (sb_readonly=0)
Nov 24 01:58:14 compute-0 ovn_controller[95380]: 2025-11-24T01:58:14Z|00042|binding|INFO|Setting lport b1da3c12-e629-4325-be7d-3295c80a73da down in Southbound
Nov 24 01:58:14 compute-0 ovn_controller[95380]: 2025-11-24T01:58:14Z|00043|binding|INFO|Removing iface tapb1da3c12-e6 ovn-installed in OVS
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.343 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.355 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:3f:6b 10.100.0.4'], port_security=['fa:16:3e:92:3f:6b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a4dcff35-86ac-46bc-939c-dc6316ffd80f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56b453c1-ee78-40be-9431-0afc399d7dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87cc0efe-9bdc-4b2d-8d1b-45269d7fdc68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42306af5-a21b-4874-a8e2-8ada30faaa43, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=b1da3c12-e629-4325-be7d-3295c80a73da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.357 104238 INFO neutron.agent.ovn.metadata.agent [-] Port b1da3c12-e629-4325-be7d-3295c80a73da in datapath 56b453c1-ee78-40be-9431-0afc399d7dbc unbound from our chassis
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.357 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.358 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 56b453c1-ee78-40be-9431-0afc399d7dbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.359 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2344a386-7307-436a-bbb1-468b57429114]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.360 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc namespace which is not needed anymore
Nov 24 01:58:14 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 24 01:58:14 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.994s CPU time.
Nov 24 01:58:14 compute-0 systemd-machined[153319]: Machine qemu-1-instance-00000001 terminated.
Nov 24 01:58:14 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [NOTICE]   (213403) : haproxy version is 2.8.14-c23fe91
Nov 24 01:58:14 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [NOTICE]   (213403) : path to executable is /usr/sbin/haproxy
Nov 24 01:58:14 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [WARNING]  (213403) : Exiting Master process...
Nov 24 01:58:14 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [ALERT]    (213403) : Current worker (213405) exited with code 143 (Terminated)
Nov 24 01:58:14 compute-0 neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc[213399]: [WARNING]  (213403) : All workers exited. Exiting... (0)
Nov 24 01:58:14 compute-0 systemd[1]: libpod-f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7.scope: Deactivated successfully.
Nov 24 01:58:14 compute-0 podman[214026]: 2025-11-24 01:58:14.501350751 +0000 UTC m=+0.042874632 container died f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 01:58:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7-userdata-shm.mount: Deactivated successfully.
Nov 24 01:58:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-c2a061fa901ba563c24a90ea81f8a5cead5184654c05133817f4e23be98631cf-merged.mount: Deactivated successfully.
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.537 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 podman[214026]: 2025-11-24 01:58:14.538795149 +0000 UTC m=+0.080319030 container cleanup f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.541 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 systemd[1]: libpod-conmon-f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7.scope: Deactivated successfully.
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.578 187003 INFO nova.virt.libvirt.driver [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Instance destroyed successfully.
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.579 187003 DEBUG nova.objects.instance [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid a4dcff35-86ac-46bc-939c-dc6316ffd80f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.596 187003 DEBUG nova.virt.libvirt.vif [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:56:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-176359546',display_name='tempest-TestNetworkBasicOps-server-176359546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-176359546',id=1,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI7K6FXfe3f7i9M+Lrq9UdTuwMlrNMPu9xzDTj17VZVDGOtmQOma9x4vWlM1AXFT60jK8li/Bc1daG4yB3t2WpOteUGAiwqwPlxKCKEKz4j8h3i95vSMtJhtNQ0wmbjHlA==',key_name='tempest-TestNetworkBasicOps-1719143779',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:57:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-4mazle30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:57:07Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a4dcff35-86ac-46bc-939c-dc6316ffd80f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.597 187003 DEBUG nova.network.os_vif_util [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.598 187003 DEBUG nova.network.os_vif_util [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.598 187003 DEBUG os_vif [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.602 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.602 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1da3c12-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.604 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.605 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 podman[214061]: 2025-11-24 01:58:14.607875214 +0000 UTC m=+0.043918481 container remove f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.609 187003 INFO os_vif [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:3f:6b,bridge_name='br-int',has_traffic_filtering=True,id=b1da3c12-e629-4325-be7d-3295c80a73da,network=Network(56b453c1-ee78-40be-9431-0afc399d7dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1da3c12-e6')
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.610 187003 INFO nova.virt.libvirt.driver [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Deleting instance files /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f_del
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.611 187003 INFO nova.virt.libvirt.driver [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Deletion of /var/lib/nova/instances/a4dcff35-86ac-46bc-939c-dc6316ffd80f_del complete
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.612 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[41cdb862-f723-45f2-b3a9-573d21033f98]: (4, ('Mon Nov 24 01:58:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc (f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7)\nf1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7\nMon Nov 24 01:58:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc (f1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7)\nf1ce383ef8f0a89767bc67dcf402abb5dcef894f0f775c2f2a2f83d138abb6f7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.614 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb343ab-10ff-4f5b-b0b1-fd35d1baa13e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.615 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56b453c1-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.617 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 kernel: tap56b453c1-e0: left promiscuous mode
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.633 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.635 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[69d553fe-14ea-4731-a752-c2fc89c7193c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.647 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[87b56ce2-22b5-4fd5-b968-519c922fcefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.649 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[92cfde0e-a2c5-4e45-bf97-508ba6660323]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.661 187003 INFO nova.compute.manager [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.661 187003 DEBUG oslo.service.loopingcall [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.662 187003 DEBUG nova.compute.manager [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.662 187003 DEBUG nova.network.neutron [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.665 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[aa23ff6e-3583-42b5-b072-f0fc2da2075e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 289718, 'reachable_time': 18338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214083, 'error': None, 'target': 'ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.668 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-56b453c1-ee78-40be-9431-0afc399d7dbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 01:58:14 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:14.668 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[f6aac596-ce6a-45cb-8731-1ded03b4797f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d56b453c1\x2dee78\x2d40be\x2d9431\x2d0afc399d7dbc.mount: Deactivated successfully.
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.727 187003 DEBUG nova.compute.manager [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-unplugged-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.727 187003 DEBUG oslo_concurrency.lockutils [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.727 187003 DEBUG oslo_concurrency.lockutils [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.728 187003 DEBUG oslo_concurrency.lockutils [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.728 187003 DEBUG nova.compute.manager [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] No waiting events found dispatching network-vif-unplugged-b1da3c12-e629-4325-be7d-3295c80a73da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:58:14 compute-0 nova_compute[186999]: 2025-11-24 01:58:14.728 187003 DEBUG nova.compute.manager [req-0883ba57-a636-461d-a893-e8a2579c0173 req-e7782d0d-1358-4455-845a-f7fa504535f0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-unplugged-b1da3c12-e629-4325-be7d-3295c80a73da for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 01:58:15 compute-0 nova_compute[186999]: 2025-11-24 01:58:15.963 187003 DEBUG nova.network.neutron [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:15 compute-0 nova_compute[186999]: 2025-11-24 01:58:15.977 187003 INFO nova.compute.manager [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Took 1.32 seconds to deallocate network for instance.
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.023 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.024 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.080 187003 DEBUG nova.compute.provider_tree [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.094 187003 DEBUG nova.scheduler.client.report [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.118 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.145 187003 INFO nova.scheduler.client.report [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance a4dcff35-86ac-46bc-939c-dc6316ffd80f
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.199 187003 DEBUG oslo_concurrency.lockutils [None req-b6f98ce1-2536-47b7-a4bf-0879fdce1ac2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.358 187003 DEBUG nova.network.neutron [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updated VIF entry in instance network info cache for port b1da3c12-e629-4325-be7d-3295c80a73da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.359 187003 DEBUG nova.network.neutron [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Updating instance_info_cache with network_info: [{"id": "b1da3c12-e629-4325-be7d-3295c80a73da", "address": "fa:16:3e:92:3f:6b", "network": {"id": "56b453c1-ee78-40be-9431-0afc399d7dbc", "bridge": "br-int", "label": "tempest-network-smoke--12842400", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1da3c12-e6", "ovs_interfaceid": "b1da3c12-e629-4325-be7d-3295c80a73da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.379 187003 DEBUG oslo_concurrency.lockutils [req-85b13028-0ac0-45b7-a274-3297f28fe8cd req-020303f9-1919-4970-a7e7-406ddf2529cb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a4dcff35-86ac-46bc-939c-dc6316ffd80f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.765 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.797 187003 DEBUG nova.compute.manager [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.798 187003 DEBUG oslo_concurrency.lockutils [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.799 187003 DEBUG oslo_concurrency.lockutils [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.799 187003 DEBUG oslo_concurrency.lockutils [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a4dcff35-86ac-46bc-939c-dc6316ffd80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.800 187003 DEBUG nova.compute.manager [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] No waiting events found dispatching network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.800 187003 WARNING nova.compute.manager [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received unexpected event network-vif-plugged-b1da3c12-e629-4325-be7d-3295c80a73da for instance with vm_state deleted and task_state None.
Nov 24 01:58:16 compute-0 nova_compute[186999]: 2025-11-24 01:58:16.801 187003 DEBUG nova.compute.manager [req-05bfa547-7bf0-4d2a-b7b5-86a0f0dd1c3d req-be41410a-8694-449b-8bc3-c73850c14ae6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Received event network-vif-deleted-b1da3c12-e629-4325-be7d-3295c80a73da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:18 compute-0 podman[214086]: 2025-11-24 01:58:18.802376752 +0000 UTC m=+0.056266167 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 01:58:19 compute-0 nova_compute[186999]: 2025-11-24 01:58:19.605 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:20 compute-0 nova_compute[186999]: 2025-11-24 01:58:20.031 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:20 compute-0 nova_compute[186999]: 2025-11-24 01:58:20.127 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:21 compute-0 nova_compute[186999]: 2025-11-24 01:58:21.765 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:21 compute-0 podman[214109]: 2025-11-24 01:58:21.823296206 +0000 UTC m=+0.076287168 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter)
Nov 24 01:58:21 compute-0 nova_compute[186999]: 2025-11-24 01:58:21.897 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949486.8958716, fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:58:21 compute-0 nova_compute[186999]: 2025-11-24 01:58:21.897 187003 INFO nova.compute.manager [-] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] VM Stopped (Lifecycle Event)
Nov 24 01:58:21 compute-0 nova_compute[186999]: 2025-11-24 01:58:21.916 187003 DEBUG nova.compute.manager [None req-4653f585-643c-400e-b526-98256434169a - - - - - -] [instance: fe06e5d2-9fe1-4078-ac8a-550dd2c3d14b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:24 compute-0 nova_compute[186999]: 2025-11-24 01:58:24.607 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:26 compute-0 nova_compute[186999]: 2025-11-24 01:58:26.808 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:28 compute-0 podman[214130]: 2025-11-24 01:58:28.799829679 +0000 UTC m=+0.052853871 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 01:58:29 compute-0 nova_compute[186999]: 2025-11-24 01:58:29.577 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949494.5758426, a4dcff35-86ac-46bc-939c-dc6316ffd80f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:58:29 compute-0 nova_compute[186999]: 2025-11-24 01:58:29.577 187003 INFO nova.compute.manager [-] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] VM Stopped (Lifecycle Event)
Nov 24 01:58:29 compute-0 nova_compute[186999]: 2025-11-24 01:58:29.593 187003 DEBUG nova.compute.manager [None req-cc77702b-9209-45de-a892-c1f114c1b037 - - - - - -] [instance: a4dcff35-86ac-46bc-939c-dc6316ffd80f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:29 compute-0 nova_compute[186999]: 2025-11-24 01:58:29.609 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:31 compute-0 nova_compute[186999]: 2025-11-24 01:58:31.817 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:32 compute-0 podman[214154]: 2025-11-24 01:58:32.80408223 +0000 UTC m=+0.061119923 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 01:58:34 compute-0 nova_compute[186999]: 2025-11-24 01:58:34.610 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:36 compute-0 nova_compute[186999]: 2025-11-24 01:58:36.819 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.510 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.511 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.527 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.608 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.608 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.616 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.616 187003 INFO nova.compute.claims [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Claim successful on node compute-0.ctlplane.example.com
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.720 187003 DEBUG nova.compute.provider_tree [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.736 187003 DEBUG nova.scheduler.client.report [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.755 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.755 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.804 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.805 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 01:58:37 compute-0 podman[214173]: 2025-11-24 01:58:37.816855582 +0000 UTC m=+0.064080405 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.931 187003 INFO nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 01:58:37 compute-0 nova_compute[186999]: 2025-11-24 01:58:37.948 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.037 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.038 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.038 187003 INFO nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Creating image(s)
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.039 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.039 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.040 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.052 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.113 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.114 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.116 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.136 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.187 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.188 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.220 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.221 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.222 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.274 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.275 187003 DEBUG nova.virt.disk.api [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.276 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.336 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.337 187003 DEBUG nova.virt.disk.api [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.338 187003 DEBUG nova.objects.instance [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.349 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.350 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Ensure instance console log exists: /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.350 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.351 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.351 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:38 compute-0 nova_compute[186999]: 2025-11-24 01:58:38.642 187003 DEBUG nova.policy [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 01:58:39 compute-0 nova_compute[186999]: 2025-11-24 01:58:39.612 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:39 compute-0 nova_compute[186999]: 2025-11-24 01:58:39.691 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Successfully created port: 81142ca2-757d-4009-a916-5629cc1bff67 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 01:58:40 compute-0 sshd-session[214209]: Invalid user fiscal from 154.90.59.75 port 37408
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.786 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Successfully updated port: 81142ca2-757d-4009-a916-5629cc1bff67 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 01:58:40 compute-0 podman[214211]: 2025-11-24 01:58:40.800490172 +0000 UTC m=+0.062665516 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.804 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.804 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.805 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:58:40 compute-0 podman[214212]: 2025-11-24 01:58:40.847748056 +0000 UTC m=+0.107319867 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.860 187003 DEBUG nova.compute.manager [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-changed-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.860 187003 DEBUG nova.compute.manager [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing instance network info cache due to event network-changed-81142ca2-757d-4009-a916-5629cc1bff67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.860 187003 DEBUG oslo_concurrency.lockutils [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:58:40 compute-0 nova_compute[186999]: 2025-11-24 01:58:40.932 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 01:58:40 compute-0 sshd-session[214209]: Received disconnect from 154.90.59.75 port 37408:11: Bye Bye [preauth]
Nov 24 01:58:40 compute-0 sshd-session[214209]: Disconnected from invalid user fiscal 154.90.59.75 port 37408 [preauth]
Nov 24 01:58:41 compute-0 nova_compute[186999]: 2025-11-24 01:58:41.820 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:41 compute-0 nova_compute[186999]: 2025-11-24 01:58:41.998 187003 DEBUG nova.network.neutron [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.011 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.012 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance network_info: |[{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.012 187003 DEBUG oslo_concurrency.lockutils [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.012 187003 DEBUG nova.network.neutron [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.017 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Start _get_guest_xml network_info=[{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.023 187003 WARNING nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.030 187003 DEBUG nova.virt.libvirt.host [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.031 187003 DEBUG nova.virt.libvirt.host [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.034 187003 DEBUG nova.virt.libvirt.host [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.035 187003 DEBUG nova.virt.libvirt.host [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.035 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.035 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.036 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.036 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.036 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.036 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.037 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.037 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.037 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.037 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.037 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.038 187003 DEBUG nova.virt.hardware [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.042 187003 DEBUG nova.virt.libvirt.vif [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:58:37Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.042 187003 DEBUG nova.network.os_vif_util [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.043 187003 DEBUG nova.network.os_vif_util [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.043 187003 DEBUG nova.objects.instance [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.054 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] End _get_guest_xml xml=<domain type="kvm">
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <uuid>8a96324d-81f3-42dd-9974-a49392009d7f</uuid>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <name>instance-00000003</name>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 01:58:42</nova:creationTime>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:58:42 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <system>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="serial">8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="uuid">8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </system>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <os>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </os>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <features>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </features>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:42:ef:4d"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <target dev="tap81142ca2-75"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log" append="off"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <video>
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </video>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 01:58:42 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 01:58:42 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:58:42 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:58:42 compute-0 nova_compute[186999]: </domain>
Nov 24 01:58:42 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.054 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Preparing to wait for external event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.055 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.055 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.055 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.056 187003 DEBUG nova.virt.libvirt.vif [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:58:37Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.056 187003 DEBUG nova.network.os_vif_util [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.057 187003 DEBUG nova.network.os_vif_util [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.057 187003 DEBUG os_vif [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.058 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.058 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.058 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.061 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.061 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81142ca2-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.062 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap81142ca2-75, col_values=(('external_ids', {'iface-id': '81142ca2-757d-4009-a916-5629cc1bff67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:ef:4d', 'vm-uuid': '8a96324d-81f3-42dd-9974-a49392009d7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.063 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 NetworkManager[55458]: <info>  [1763949522.0643] manager: (tap81142ca2-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.066 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.071 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.072 187003 INFO os_vif [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75')
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.117 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.117 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.117 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:42:ef:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.118 187003 INFO nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Using config drive
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.722 187003 INFO nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Creating config drive at /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.731 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphslpcy0k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.871 187003 DEBUG oslo_concurrency.processutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphslpcy0k" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:58:42 compute-0 kernel: tap81142ca2-75: entered promiscuous mode
Nov 24 01:58:42 compute-0 NetworkManager[55458]: <info>  [1763949522.9414] manager: (tap81142ca2-75): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 24 01:58:42 compute-0 systemd-udevd[214278]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:58:42 compute-0 ovn_controller[95380]: 2025-11-24T01:58:42Z|00044|binding|INFO|Claiming lport 81142ca2-757d-4009-a916-5629cc1bff67 for this chassis.
Nov 24 01:58:42 compute-0 ovn_controller[95380]: 2025-11-24T01:58:42Z|00045|binding|INFO|81142ca2-757d-4009-a916-5629cc1bff67: Claiming fa:16:3e:42:ef:4d 10.100.0.14
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.977 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 nova_compute[186999]: 2025-11-24 01:58:42.980 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:42.991 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:ef:4d 10.100.0.14'], port_security=['fa:16:3e:42:ef:4d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '586ea056-5f50-47a9-ae5d-2f44abd0c7c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=627b2843-fb67-4184-b3db-30deb84eba89, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=81142ca2-757d-4009-a916-5629cc1bff67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:58:42 compute-0 NetworkManager[55458]: <info>  [1763949522.9921] device (tap81142ca2-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:58:42 compute-0 NetworkManager[55458]: <info>  [1763949522.9930] device (tap81142ca2-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 01:58:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:42.993 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 81142ca2-757d-4009-a916-5629cc1bff67 in datapath f2383360-95a5-4b5a-9aa4-a99b489f9cee bound to our chassis
Nov 24 01:58:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:42.994 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2383360-95a5-4b5a-9aa4-a99b489f9cee
Nov 24 01:58:43 compute-0 systemd-machined[153319]: New machine qemu-3-instance-00000003.
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.008 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0e97ad95-a2f9-4496-bd96-9f12333b5470]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.009 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2383360-91 in ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.011 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2383360-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.011 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6044a37a-d395-4e48-abca-86190f89cffe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.012 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[47294781-b8fe-4b5a-a328-7bed99ef0bbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.024 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce2445d-9908-4ba1-a8eb-5951b67b73fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_controller[95380]: 2025-11-24T01:58:43Z|00046|binding|INFO|Setting lport 81142ca2-757d-4009-a916-5629cc1bff67 ovn-installed in OVS
Nov 24 01:58:43 compute-0 ovn_controller[95380]: 2025-11-24T01:58:43Z|00047|binding|INFO|Setting lport 81142ca2-757d-4009-a916-5629cc1bff67 up in Southbound
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.037 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[cb370817-7cbc-43ab-a6cb-ec58f8748998]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.038 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:43 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.070 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[579fdf30-baa3-4f3c-9ad7-3e934b4d012e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 NetworkManager[55458]: <info>  [1763949523.0777] manager: (tapf2383360-90): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.077 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[71a3112b-5803-45ba-8a46-68d3e0d96835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.108 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[3499c971-588d-46c4-813c-cdce365aac75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.111 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9148a8-1610-4fcb-988f-afb218bc384e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 NetworkManager[55458]: <info>  [1763949523.1361] device (tapf2383360-90): carrier: link connected
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.141 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[64aa3a37-4b4f-45a6-aaa6-1c8ebe248cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.157 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[39cf2c7c-420f-4a6f-b4e7-e1658086d558]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2383360-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:33:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 298979, 'reachable_time': 41348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214314, 'error': None, 'target': 'ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.172 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1488f6fc-d6e2-46b0-9493-9f07eca5113c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:3337'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 298979, 'tstamp': 298979}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214315, 'error': None, 'target': 'ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.186 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6e15c66c-b0c2-4f19-8f22-06ebf877398f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2383360-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:33:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 298979, 'reachable_time': 41348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214317, 'error': None, 'target': 'ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.213 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[db2f96f0-6153-41c2-9b82-fe0754534295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.265 187003 DEBUG nova.compute.manager [req-0185c05a-f7dd-4b7d-95f3-e3f188d0d848 req-098af933-4c8d-4fc8-8639-3e268f7415b0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.266 187003 DEBUG oslo_concurrency.lockutils [req-0185c05a-f7dd-4b7d-95f3-e3f188d0d848 req-098af933-4c8d-4fc8-8639-3e268f7415b0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.266 187003 DEBUG oslo_concurrency.lockutils [req-0185c05a-f7dd-4b7d-95f3-e3f188d0d848 req-098af933-4c8d-4fc8-8639-3e268f7415b0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.267 187003 DEBUG oslo_concurrency.lockutils [req-0185c05a-f7dd-4b7d-95f3-e3f188d0d848 req-098af933-4c8d-4fc8-8639-3e268f7415b0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.267 187003 DEBUG nova.compute.manager [req-0185c05a-f7dd-4b7d-95f3-e3f188d0d848 req-098af933-4c8d-4fc8-8639-3e268f7415b0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Processing event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.272 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3a494063-ca1a-4529-9f8a-3d3f43f057ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.273 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2383360-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.274 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.274 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2383360-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.276 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:43 compute-0 NetworkManager[55458]: <info>  [1763949523.2766] manager: (tapf2383360-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 24 01:58:43 compute-0 kernel: tapf2383360-90: entered promiscuous mode
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.278 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.279 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2383360-90, col_values=(('external_ids', {'iface-id': '309d406e-bca9-4d02-bf94-de7a1bfd7dea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.280 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:43 compute-0 ovn_controller[95380]: 2025-11-24T01:58:43Z|00048|binding|INFO|Releasing lport 309d406e-bca9-4d02-bf94-de7a1bfd7dea from this chassis (sb_readonly=0)
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.296 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.298 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2383360-95a5-4b5a-9aa4-a99b489f9cee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2383360-95a5-4b5a-9aa4-a99b489f9cee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.299 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ca8866-e731-441f-8963-4ac4fc09343e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.299 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: global
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-f2383360-95a5-4b5a-9aa4-a99b489f9cee
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/f2383360-95a5-4b5a-9aa4-a99b489f9cee.pid.haproxy
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID f2383360-95a5-4b5a-9aa4-a99b489f9cee
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 01:58:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:43.300 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'env', 'PROCESS_TAG=haproxy-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2383360-95a5-4b5a-9aa4-a99b489f9cee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.369 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.369 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949523.3686101, 8a96324d-81f3-42dd-9974-a49392009d7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.370 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] VM Started (Lifecycle Event)
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.379 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.386 187003 INFO nova.virt.libvirt.driver [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance spawned successfully.
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.387 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.394 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.399 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.404 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.405 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.405 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.406 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.406 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.411 187003 DEBUG nova.virt.libvirt.driver [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.417 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.417 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949523.3695183, 8a96324d-81f3-42dd-9974-a49392009d7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.418 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] VM Paused (Lifecycle Event)
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.430 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.433 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949523.3774312, 8a96324d-81f3-42dd-9974-a49392009d7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.434 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] VM Resumed (Lifecycle Event)
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.447 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.450 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.468 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.493 187003 INFO nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Took 5.46 seconds to spawn the instance on the hypervisor.
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.493 187003 DEBUG nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.550 187003 INFO nova.compute.manager [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Took 5.97 seconds to build instance.
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.576 187003 DEBUG oslo_concurrency.lockutils [None req-c93d5e7e-8edd-4860-9ef8-6f2ee75393bf e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.618 187003 DEBUG nova.network.neutron [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updated VIF entry in instance network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.618 187003 DEBUG nova.network.neutron [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:43 compute-0 nova_compute[186999]: 2025-11-24 01:58:43.644 187003 DEBUG oslo_concurrency.lockutils [req-47b70d22-ea3c-43f9-bdeb-4387a6841bda req-f0a79ff8-e617-45ad-82bb-a53170ffd5b6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:58:43 compute-0 podman[214355]: 2025-11-24 01:58:43.669662378 +0000 UTC m=+0.045563167 container create 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 01:58:43 compute-0 systemd[1]: Started libpod-conmon-834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495.scope.
Nov 24 01:58:43 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:58:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d370607bb3a132adea369a949871949a3668679e9304ea0c38231c003f347fa4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:58:43 compute-0 podman[214355]: 2025-11-24 01:58:43.743527597 +0000 UTC m=+0.119428406 container init 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 01:58:43 compute-0 podman[214355]: 2025-11-24 01:58:43.648092624 +0000 UTC m=+0.023993433 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:58:43 compute-0 podman[214355]: 2025-11-24 01:58:43.750814691 +0000 UTC m=+0.126715480 container start 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 24 01:58:43 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [NOTICE]   (214375) : New worker (214377) forked
Nov 24 01:58:43 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [NOTICE]   (214375) : Loading success.
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.354 187003 DEBUG nova.compute.manager [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.355 187003 DEBUG oslo_concurrency.lockutils [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.355 187003 DEBUG oslo_concurrency.lockutils [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.355 187003 DEBUG oslo_concurrency.lockutils [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.355 187003 DEBUG nova.compute.manager [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.355 187003 WARNING nova.compute.manager [req-aca53740-e516-4f49-80c8-b4c1c48b161f req-67b99d03-27ca-4232-a6db-03f99e6641c8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 for instance with vm_state active and task_state None.
Nov 24 01:58:45 compute-0 nova_compute[186999]: 2025-11-24 01:58:45.356 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:45.359 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:58:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:45.362 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 01:58:45 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:45.365 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:58:46 compute-0 ovn_controller[95380]: 2025-11-24T01:58:46Z|00049|binding|INFO|Releasing lport 309d406e-bca9-4d02-bf94-de7a1bfd7dea from this chassis (sb_readonly=0)
Nov 24 01:58:46 compute-0 nova_compute[186999]: 2025-11-24 01:58:46.012 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:46 compute-0 NetworkManager[55458]: <info>  [1763949526.0127] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 24 01:58:46 compute-0 NetworkManager[55458]: <info>  [1763949526.0137] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 24 01:58:46 compute-0 ovn_controller[95380]: 2025-11-24T01:58:46Z|00050|binding|INFO|Releasing lport 309d406e-bca9-4d02-bf94-de7a1bfd7dea from this chassis (sb_readonly=0)
Nov 24 01:58:46 compute-0 nova_compute[186999]: 2025-11-24 01:58:46.042 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:46 compute-0 nova_compute[186999]: 2025-11-24 01:58:46.046 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:46 compute-0 nova_compute[186999]: 2025-11-24 01:58:46.822 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.064 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.438 187003 DEBUG nova.compute.manager [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-changed-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.439 187003 DEBUG nova.compute.manager [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing instance network info cache due to event network-changed-81142ca2-757d-4009-a916-5629cc1bff67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.440 187003 DEBUG oslo_concurrency.lockutils [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.440 187003 DEBUG oslo_concurrency.lockutils [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:58:47 compute-0 nova_compute[186999]: 2025-11-24 01:58:47.441 187003 DEBUG nova.network.neutron [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:58:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:48.419 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:58:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:48.420 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:58:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:58:48.421 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:58:49 compute-0 podman[214388]: 2025-11-24 01:58:49.808832108 +0000 UTC m=+0.063458321 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 01:58:51 compute-0 nova_compute[186999]: 2025-11-24 01:58:51.840 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:52 compute-0 nova_compute[186999]: 2025-11-24 01:58:52.065 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:52 compute-0 nova_compute[186999]: 2025-11-24 01:58:52.592 187003 DEBUG nova.network.neutron [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updated VIF entry in instance network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:58:52 compute-0 nova_compute[186999]: 2025-11-24 01:58:52.592 187003 DEBUG nova.network.neutron [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:58:52 compute-0 nova_compute[186999]: 2025-11-24 01:58:52.615 187003 DEBUG oslo_concurrency.lockutils [req-a3769f42-de55-47c9-874b-a4a7d76e4291 req-7f2fe50f-aee0-4e71-9618-a10f2a6719d3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:58:52 compute-0 podman[214409]: 2025-11-24 01:58:52.830074847 +0000 UTC m=+0.082794282 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64)
Nov 24 01:58:56 compute-0 ovn_controller[95380]: 2025-11-24T01:58:56Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:ef:4d 10.100.0.14
Nov 24 01:58:56 compute-0 ovn_controller[95380]: 2025-11-24T01:58:56Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:ef:4d 10.100.0.14
Nov 24 01:58:56 compute-0 nova_compute[186999]: 2025-11-24 01:58:56.848 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:57 compute-0 nova_compute[186999]: 2025-11-24 01:58:57.066 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:58:58 compute-0 sshd-session[214387]: Invalid user orangepi from 68.210.96.117 port 51142
Nov 24 01:58:59 compute-0 sshd-session[214387]: Connection closed by invalid user orangepi 68.210.96.117 port 51142 [preauth]
Nov 24 01:58:59 compute-0 podman[214447]: 2025-11-24 01:58:59.804036258 +0000 UTC m=+0.055001529 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 01:59:01 compute-0 nova_compute[186999]: 2025-11-24 01:59:01.874 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:02 compute-0 nova_compute[186999]: 2025-11-24 01:59:02.069 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:03 compute-0 nova_compute[186999]: 2025-11-24 01:59:03.195 187003 INFO nova.compute.manager [None req-5da0a1c8-931a-4b7d-a5b3-457e2ff11fd6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Get console output
Nov 24 01:59:03 compute-0 nova_compute[186999]: 2025-11-24 01:59:03.202 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 01:59:03 compute-0 podman[214471]: 2025-11-24 01:59:03.790751033 +0000 UTC m=+0.047245946 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:59:06 compute-0 nova_compute[186999]: 2025-11-24 01:59:06.465 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:06 compute-0 nova_compute[186999]: 2025-11-24 01:59:06.466 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:06 compute-0 nova_compute[186999]: 2025-11-24 01:59:06.466 187003 DEBUG nova.objects.instance [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'flavor' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:06 compute-0 nova_compute[186999]: 2025-11-24 01:59:06.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:06 compute-0 nova_compute[186999]: 2025-11-24 01:59:06.876 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:07 compute-0 nova_compute[186999]: 2025-11-24 01:59:07.071 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:07 compute-0 nova_compute[186999]: 2025-11-24 01:59:07.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:07 compute-0 nova_compute[186999]: 2025-11-24 01:59:07.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 01:59:07 compute-0 nova_compute[186999]: 2025-11-24 01:59:07.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.604 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.604 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.604 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.604 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.649 187003 DEBUG nova.objects.instance [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:08 compute-0 nova_compute[186999]: 2025-11-24 01:59:08.661 187003 DEBUG nova.network.neutron [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 01:59:08 compute-0 podman[214491]: 2025-11-24 01:59:08.800694467 +0000 UTC m=+0.052874881 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 01:59:09 compute-0 nova_compute[186999]: 2025-11-24 01:59:09.644 187003 DEBUG nova.policy [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.083 187003 DEBUG nova.network.neutron [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Successfully created port: a01679b1-1507-45f5-ab78-92e9417de5c7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.250 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'name': 'tempest-TestNetworkBasicOps-server-407998524', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'hostId': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.281 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.latency volume: 53203390182 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.281 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60874ae2-b2d2-478f-8e5d-98ba93de0df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53203390182, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.251352', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b9036ec-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '475d48fe4eec68b8be36b7291c1c23be2c899d987dfd5814b165bd167721d69f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.251352', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b90465a-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '7fbcbdba1da88434ab4d478ec0aa62f8268ce75f2751c3bdbb908aeb0ea77e2a'}]}, 'timestamp': '2025-11-24 01:59:11.282079', '_unique_id': '00534de8ad234f85aa9c56ff011a081a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.283 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.293 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.294 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a203f952-db51-4b4d-b699-6fc87db7833b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.284299', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b922088-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': 'f3a1d63a30ca482a0877fe2906eb05dbcb3801b6504fb360a39e57f2728248c5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.284299', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b922d4e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': '931be48d6565eb756f1f8ac6806eea825049d0101dfa9a49e71e89bdcc00fe2f'}]}, 'timestamp': '2025-11-24 01:59:11.294538', '_unique_id': 'ac299dbff5c24ca4a87a3c1484e435fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.295 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.296 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.296 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>]
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.298 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8a96324d-81f3-42dd-9974-a49392009d7f / tap81142ca2-75 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.299 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cdaba84-49cf-4653-b9d1-04dee07d0c47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.296467', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b92ef72-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '75f59e7f09e62594e43389f4d17b2136b229b689064b105393c673ab3dbcf817'}]}, 'timestamp': '2025-11-24 01:59:11.299508', '_unique_id': 'b6174e1577ca4daea2285475f0fa3813'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.bytes volume: 72949760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '738196fa-d6fd-47b6-a32c-db3b2abebdb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72949760, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.301029', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b93354a-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '22fd77230f8e32c1d074f09cef8eae343931eedb2e94aca3e29eb5df76fb6ac7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.301029', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b933d42-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '6500cb4d3abe9d3083998d6f45d8071e488670fa29ea86052963518c9b798397'}]}, 'timestamp': '2025-11-24 01:59:11.301450', '_unique_id': '4db474672e9c49759695cad0ac130d7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.301 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.302 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.302 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.bytes volume: 30489088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.302 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df345ba6-9572-4beb-a739-48f8846a6217', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30489088, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.302606', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b937442-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '6cbefbeee65d560b3d6eeffabb98ef81fd24db10704cc349e01258e791dfd86d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.302606', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b937cee-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '62c0fff9e71a466eb4536142106c1a1d9c7da819e3d81884fdfae7d4dda92544'}]}, 'timestamp': '2025-11-24 01:59:11.303080', '_unique_id': 'd9e1b611731d4985860606816f4b563b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.303 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>]
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.304 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3e1b3ef-82de-4879-962b-5de39914a2e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.304560', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b93bf24-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': 'c081f5d92c181c527cffed000e30863bd7066b61fc1ebc976e732c58f69768d8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.304560', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b93c7bc-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': 'ed58e1a5cc2f29ff46cd96a5a8c7964f45c2a415d1f068738c10f8da2672049e'}]}, 'timestamp': '2025-11-24 01:59:11.304997', '_unique_id': '2a6e4586dcce4c98944840d030c7a063'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.306 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df1e10ff-95f3-4f44-a0d1-36e7c77593b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.306315', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b94045c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': 'cef69034b7165f5b7ddef5bfac3e725611d7152994863c1391c563c1828cff74'}]}, 'timestamp': '2025-11-24 01:59:11.306577', '_unique_id': 'e1301a59a5d5498990c0a95fa191c72c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.307 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>]
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.308 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.outgoing.packets volume: 56 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb0a94c5-e923-4e8b-8a60-eb17a37d7466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 56, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.308138', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b944be2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '6b49f07159b9bcb64ec256e6d0df08da713e425f3aa055a6894fabb370197bd3'}]}, 'timestamp': '2025-11-24 01:59:11.308433', '_unique_id': '51ea4fc89c8e447fb00e8446593cdc7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.309 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.323 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/cpu volume: 10770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972eafb3-3ed9-4e87-81fa-1782a590a403', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10770000000, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'timestamp': '2025-11-24T01:59:11.309665', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2b96b472-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.039813983, 'message_signature': 'f12a46ea099e257d360353abe0093591eb4ee673fc4a8d3fe7153246b32a2839'}]}, 'timestamp': '2025-11-24 01:59:11.324233', '_unique_id': '966908085be243f8b1cc7d3d061e8829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.325 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.outgoing.bytes volume: 8266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac5f3c2-8b54-4fc0-8a43-21bd5c485f65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8266, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.325712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b96fc0c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '65ec3a1522d4fccb5a211416720f8a0b48611d5cf870de47086442adba59a3e7'}]}, 'timestamp': '2025-11-24 01:59:11.326018', '_unique_id': 'c96f984990224c9ab3385bd33a7a38a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.326 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd132be9-9372-4e49-a59f-7351e14c7e22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.327158', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b9731a4-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '0ea605c77548c13052ab37e0af15fcc62d8b188144473beaf66ab203633aff96'}]}, 'timestamp': '2025-11-24 01:59:11.327383', '_unique_id': '92cb16b86d4d47aeade511e2e06797f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.327 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.328 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.incoming.bytes volume: 10359 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90ccf2bc-8de1-41a5-9ebe-5a3f60f94c8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10359, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.328445', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b9763fe-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': 'e42d8f0f876dba02ed9e8616a873584b36504e86ecc508297a5b7f09b3be5861'}]}, 'timestamp': '2025-11-24 01:59:11.328671', '_unique_id': 'c75eeef740854ec38d78352fc9c2a4dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.329 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f45f4029-8191-453d-9b94-75a177c427e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.329838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b979b4e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '3c96218541582694b556c08d9236e1577196b1eba1d688cc4a04493bfbf177ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.329838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b97a4cc-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': '8dc53778ae2e48f7eadbccd52831e6d7162facc823a68eefb032bab3a6be9896'}]}, 'timestamp': '2025-11-24 01:59:11.330316', '_unique_id': '69e71ee813c443cb9a26adf82ba7a853'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.330 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.331 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.incoming.packets volume: 59 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e121927e-9e2a-4d2a-a199-f974b5451a1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 59, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.331540', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b97dd8e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '4e0f405295cef94644b3234bc409b0ba74dc8c273d45f1d20b7b46de6b9e3b71'}]}, 'timestamp': '2025-11-24 01:59:11.331839', '_unique_id': '34418d064c024d8b8f784083e3b05300'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.333 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '365156c3-34d5-4470-84b4-197c888f0ab2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.333311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b982258-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '9b41a4342b8666eea01b218f18b0cfa3189a257b5504069f14f0fb812f5fd224'}]}, 'timestamp': '2025-11-24 01:59:11.333549', '_unique_id': 'f257c0cab1da4fa6a852623e2b9e2622'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.334 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.335 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-407998524>]
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.335 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.335 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.335 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cd4d129-3731-42fc-84bd-0f220039cf16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.335251', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b986dee-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': 'f6c333e5ce772f5c6ec16dd76c37586229576e559a9979a7d4d6b2e5c9c47496'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.335251', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b9875fa-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': 'f70f7f23a6879a6a914b42cd63149b96ae0434bac16af4df8504d5b39e7df4a0'}]}, 'timestamp': '2025-11-24 01:59:11.335670', '_unique_id': '86effa346dc34d6da0ae53bb0f1fc9c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.336 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.latency volume: 704506039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.read.latency volume: 77702848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73dcdd88-376f-494a-b057-78dbe1d82a6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 704506039, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.336901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b98af66-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': 'c6840127acf3c82aca9b5ee118f34f1e10471a7f16e66a5823dce424507ad536'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 77702848, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.336901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b98b97a-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3017.967542491, 'message_signature': 'fb6943eaca3ab3a4d0bbaf4f2e6a420d68fc6be191e224a400b1bcf5eff78694'}]}, 'timestamp': '2025-11-24 01:59:11.337400', '_unique_id': 'a1e3528773494d7a89e03e07258e8d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.338 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.338 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.338 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a73dda9-08c8-4001-880b-b1ced362db3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f-vda', 'timestamp': '2025-11-24T01:59:11.338596', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b98f0fc-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': '5c0f371f769bde4fdf2edbaeee6618cea1c9e2132d8585a84c8d53b2d7cf5219'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'8a96324d-81f3-42dd-9974-a49392009d7f-sda', 'timestamp': '2025-11-24T01:59:11.338596', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b98fbe2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.000502035, 'message_signature': '8b4ed98473334956afcb00b3512ae728f5b3be43cdac7ddda722dcaff5e74ad6'}]}, 'timestamp': '2025-11-24 01:59:11.339101', '_unique_id': '5426cffed85549dd8148cc4a9ec7b324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.340 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/memory.usage volume: 42.55859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4473cf00-e978-4d4a-954d-ee33427b1784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.55859375, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'timestamp': '2025-11-24T01:59:11.340278', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'instance-00000003', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2b993256-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.039813983, 'message_signature': '5b9b4634cf0d8e48efa6a9b09d7c3350e47a1d6a5e9e182138a9cd43607348e5'}]}, 'timestamp': '2025-11-24 01:59:11.340516', '_unique_id': '1f5d6c6685ad459aa166532c757d8ae8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.341 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5188ebd5-042b-41a0-8732-be971738222b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.341684', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b996a3c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': 'cd0195bf2aa9009a7d380010ff3003944ffabc7b6db5a48722bdcd70210ac349'}]}, 'timestamp': '2025-11-24 01:59:11.342121', '_unique_id': '40515164d7a54d7e9bfba671ff842682'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.343 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.343 12 DEBUG ceilometer.compute.pollsters [-] 8a96324d-81f3-42dd-9974-a49392009d7f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f31f9b47-e791-43e1-ba18-5e41797d88a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000003-8a96324d-81f3-42dd-9974-a49392009d7f-tap81142ca2-75', 'timestamp': '2025-11-24T01:59:11.343382', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-407998524', 'name': 'tap81142ca2-75', 'instance_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:ef:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81142ca2-75'}, 'message_id': '2b99ab6e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3018.012645528, 'message_signature': '93d3d3fd9b8c416b00af6b177a89835bd0374c2832e787cec8bfe322a665be43'}]}, 'timestamp': '2025-11-24 01:59:11.343606', '_unique_id': '3b31021ac6e344a9bdb7a44acbfd577a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 01:59:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 01:59:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.393 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.404 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.405 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.405 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.405 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.406 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.406 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.406 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.431 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.431 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.432 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.432 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.501 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:11 compute-0 podman[214512]: 2025-11-24 01:59:11.523503293 +0000 UTC m=+0.052555813 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.560 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:11 compute-0 podman[214513]: 2025-11-24 01:59:11.561657659 +0000 UTC m=+0.088001974 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.562 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.616 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.758 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.759 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5612MB free_disk=73.43324279785156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.759 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.760 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.818 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 8a96324d-81f3-42dd-9974-a49392009d7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.819 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.819 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.852 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.865 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.886 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.887 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:11 compute-0 nova_compute[186999]: 2025-11-24 01:59:11.915 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:12 compute-0 nova_compute[186999]: 2025-11-24 01:59:12.072 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.614 187003 DEBUG nova.network.neutron [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Successfully updated port: a01679b1-1507-45f5-ab78-92e9417de5c7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.639 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.640 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.640 187003 DEBUG nova.network.neutron [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.719 187003 DEBUG nova.compute.manager [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-changed-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.720 187003 DEBUG nova.compute.manager [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing instance network info cache due to event network-changed-a01679b1-1507-45f5-ab78-92e9417de5c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.720 187003 DEBUG oslo_concurrency.lockutils [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:13 compute-0 nova_compute[186999]: 2025-11-24 01:59:13.882 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:14 compute-0 nova_compute[186999]: 2025-11-24 01:59:14.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 01:59:14 compute-0 nova_compute[186999]: 2025-11-24 01:59:14.770 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 01:59:14 compute-0 sshd-session[214562]: Invalid user sammy from 46.188.119.26 port 36460
Nov 24 01:59:14 compute-0 sshd-session[214562]: Received disconnect from 46.188.119.26 port 36460:11: Bye Bye [preauth]
Nov 24 01:59:14 compute-0 sshd-session[214562]: Disconnected from invalid user sammy 46.188.119.26 port 36460 [preauth]
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.233 187003 DEBUG nova.network.neutron [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.255 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.256 187003 DEBUG oslo_concurrency.lockutils [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.256 187003 DEBUG nova.network.neutron [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing network info cache for port a01679b1-1507-45f5-ab78-92e9417de5c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.260 187003 DEBUG nova.virt.libvirt.vif [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.260 187003 DEBUG nova.network.os_vif_util [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.261 187003 DEBUG nova.network.os_vif_util [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.261 187003 DEBUG os_vif [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.262 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.262 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.263 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.266 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.266 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa01679b1-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.267 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa01679b1-15, col_values=(('external_ids', {'iface-id': 'a01679b1-1507-45f5-ab78-92e9417de5c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:ef:43', 'vm-uuid': '8a96324d-81f3-42dd-9974-a49392009d7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.269 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.2713] manager: (tapa01679b1-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.278 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.281 187003 INFO os_vif [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15')
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.282 187003 DEBUG nova.virt.libvirt.vif [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.282 187003 DEBUG nova.network.os_vif_util [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.283 187003 DEBUG nova.network.os_vif_util [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.287 187003 DEBUG nova.virt.libvirt.guest [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] attach device xml: <interface type="ethernet">
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:2e:ef:43"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <target dev="tapa01679b1-15"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]: </interface>
Nov 24 01:59:15 compute-0 nova_compute[186999]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 24 01:59:15 compute-0 kernel: tapa01679b1-15: entered promiscuous mode
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.3033] manager: (tapa01679b1-15): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 24 01:59:15 compute-0 ovn_controller[95380]: 2025-11-24T01:59:15Z|00051|binding|INFO|Claiming lport a01679b1-1507-45f5-ab78-92e9417de5c7 for this chassis.
Nov 24 01:59:15 compute-0 ovn_controller[95380]: 2025-11-24T01:59:15Z|00052|binding|INFO|a01679b1-1507-45f5-ab78-92e9417de5c7: Claiming fa:16:3e:2e:ef:43 10.100.0.22
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.305 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.309 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.315 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ef:43 10.100.0.22'], port_security=['fa:16:3e:2e:ef:43 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=307cd83f-c0cf-442e-bc72-70d62a2c9124, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a01679b1-1507-45f5-ab78-92e9417de5c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.316 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a01679b1-1507-45f5-ab78-92e9417de5c7 in datapath ee0e8153-ce03-4d00-ba76-00ea43cfdc08 bound to our chassis
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.317 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee0e8153-ce03-4d00-ba76-00ea43cfdc08
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.330 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[01571f34-0220-4643-8f5e-4e21d0515128]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.331 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee0e8153-c1 in ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.332 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee0e8153-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.332 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[700acd12-4cbb-4041-8a82-cb44a7541ffc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.333 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[caa561bb-9b41-4d24-8c0c-05cd2a4368f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_controller[95380]: 2025-11-24T01:59:15Z|00053|binding|INFO|Setting lport a01679b1-1507-45f5-ab78-92e9417de5c7 ovn-installed in OVS
Nov 24 01:59:15 compute-0 ovn_controller[95380]: 2025-11-24T01:59:15Z|00054|binding|INFO|Setting lport a01679b1-1507-45f5-ab78-92e9417de5c7 up in Southbound
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.365 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.349 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[596925f0-5959-4546-964c-cb8aefa2555e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 systemd-udevd[214573]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.3891] device (tapa01679b1-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.3907] device (tapa01679b1-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.394 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[68d03957-b707-4e84-957b-317c58280191]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.409 187003 DEBUG nova.virt.libvirt.driver [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.409 187003 DEBUG nova.virt.libvirt.driver [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.409 187003 DEBUG nova.virt.libvirt.driver [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:42:ef:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.410 187003 DEBUG nova.virt.libvirt.driver [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:2e:ef:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.423 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd1ba86-9037-49f2-84e4-aae3fa6e2cc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.430 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3fecff1e-e8fa-4ab8-ad22-2278ed9bcfe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.4313] manager: (tapee0e8153-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Nov 24 01:59:15 compute-0 systemd-udevd[214575]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.440 187003 DEBUG nova.virt.libvirt.guest [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:15</nova:creationTime>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:15 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     <nova:port uuid="a01679b1-1507-45f5-ab78-92e9417de5c7">
Nov 24 01:59:15 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 24 01:59:15 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:15 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:15 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:15 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.461 187003 DEBUG oslo_concurrency.lockutils [None req-7e5d713c-a66c-421a-ae40-29a3d78c04e6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.471 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[555d4c83-3dc0-4eeb-a187-d4176be90057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.474 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f603ee-f179-4c89-935e-cabc15b97113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.5089] device (tapee0e8153-c0): carrier: link connected
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.514 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[05390825-83c4-4055-8e9e-ae9164fe79ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.543 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b8491aa8-0cd3-4c93-8312-95589327e417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee0e8153-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:bf:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 302217, 'reachable_time': 20984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214597, 'error': None, 'target': 'ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.556 187003 DEBUG nova.compute.manager [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.556 187003 DEBUG oslo_concurrency.lockutils [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.557 187003 DEBUG oslo_concurrency.lockutils [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.557 187003 DEBUG oslo_concurrency.lockutils [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.557 187003 DEBUG nova.compute.manager [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.558 187003 WARNING nova.compute.manager [req-630855bb-ea42-48e8-ab70-d80427142442 req-57176a46-6341-4a1c-a140-4bcd22b2055e 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 for instance with vm_state active and task_state None.
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.562 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[40b01367-3a98-4a6e-9083-07f0eff59e54]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:bf9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 302217, 'tstamp': 302217}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214598, 'error': None, 'target': 'ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.592 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[5f11c833-7858-4ec2-a491-59f0d5111790]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee0e8153-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:bf:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 302217, 'reachable_time': 20984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214599, 'error': None, 'target': 'ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.634 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9cff76fa-8575-4ef6-ba10-9114bc0d7a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.719 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3807f3-5b36-4685-9ba8-51dfa8b27dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.720 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee0e8153-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.721 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.721 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee0e8153-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 NetworkManager[55458]: <info>  [1763949555.7247] manager: (tapee0e8153-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Nov 24 01:59:15 compute-0 kernel: tapee0e8153-c0: entered promiscuous mode
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.723 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.726 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.727 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee0e8153-c0, col_values=(('external_ids', {'iface-id': '18fb75eb-ac12-4a7f-8ceb-6a24ee4f6c91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.729 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 ovn_controller[95380]: 2025-11-24T01:59:15Z|00055|binding|INFO|Releasing lport 18fb75eb-ac12-4a7f-8ceb-6a24ee4f6c91 from this chassis (sb_readonly=0)
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.730 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.730 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee0e8153-ce03-4d00-ba76-00ea43cfdc08.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee0e8153-ce03-4d00-ba76-00ea43cfdc08.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.731 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf9feb0-a9a0-4c36-842c-4dffdf3cd508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.733 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: global
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-ee0e8153-ce03-4d00-ba76-00ea43cfdc08
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/ee0e8153-ce03-4d00-ba76-00ea43cfdc08.pid.haproxy
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID ee0e8153-ce03-4d00-ba76-00ea43cfdc08
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 01:59:15 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:15.734 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'env', 'PROCESS_TAG=haproxy-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee0e8153-ce03-4d00-ba76-00ea43cfdc08.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 01:59:15 compute-0 nova_compute[186999]: 2025-11-24 01:59:15.742 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:16 compute-0 podman[214631]: 2025-11-24 01:59:16.117688975 +0000 UTC m=+0.048495591 container create 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 01:59:16 compute-0 systemd[1]: Started libpod-conmon-86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb.scope.
Nov 24 01:59:16 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:59:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a86960dcf5f4494c23ab6334f2c2e2b73fd6350bcdd6e9420463c735303c7c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:59:16 compute-0 podman[214631]: 2025-11-24 01:59:16.092459163 +0000 UTC m=+0.023265789 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:59:16 compute-0 podman[214631]: 2025-11-24 01:59:16.194787529 +0000 UTC m=+0.125594155 container init 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 01:59:16 compute-0 podman[214631]: 2025-11-24 01:59:16.201985056 +0000 UTC m=+0.132791662 container start 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 01:59:16 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [NOTICE]   (214652) : New worker (214654) forked
Nov 24 01:59:16 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [NOTICE]   (214652) : Loading success.
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.593 187003 DEBUG nova.network.neutron [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updated VIF entry in instance network info cache for port a01679b1-1507-45f5-ab78-92e9417de5c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.594 187003 DEBUG nova.network.neutron [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.610 187003 DEBUG oslo_concurrency.lockutils [req-42cdb434-26da-4d16-926c-87a1391e24aa req-f92bf44b-e858-4823-b1c6-7393d8eb4c9d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.729 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-a01679b1-1507-45f5-ab78-92e9417de5c7" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.730 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-a01679b1-1507-45f5-ab78-92e9417de5c7" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.746 187003 DEBUG nova.objects.instance [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'flavor' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.768 187003 DEBUG nova.virt.libvirt.vif [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.768 187003 DEBUG nova.network.os_vif_util [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.769 187003 DEBUG nova.network.os_vif_util [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.773 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.776 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.780 187003 DEBUG nova.virt.libvirt.driver [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Attempting to detach device tapa01679b1-15 from instance 8a96324d-81f3-42dd-9974-a49392009d7f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.780 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] detach device xml: <interface type="ethernet">
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:2e:ef:43"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <target dev="tapa01679b1-15"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </interface>
Nov 24 01:59:16 compute-0 nova_compute[186999]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.788 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.792 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <name>instance-00000003</name>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <uuid>8a96324d-81f3-42dd-9974-a49392009d7f</uuid>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:15</nova:creationTime>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:port uuid="a01679b1-1507-45f5-ab78-92e9417de5c7">
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <resource>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </resource>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <system>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='serial'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='uuid'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </system>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <os>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </os>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <features>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </features>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk' index='2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config' index='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:42:ef:4d'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='tap81142ca2-75'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:2e:ef:43'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='tapa01679b1-15'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='net1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       </target>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </console>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <video>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </video>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c349,c991</label>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c349,c991</imagelabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </domain>
Nov 24 01:59:16 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.792 187003 INFO nova.virt.libvirt.driver [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully detached device tapa01679b1-15 from instance 8a96324d-81f3-42dd-9974-a49392009d7f from the persistent domain config.
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.792 187003 DEBUG nova.virt.libvirt.driver [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] (1/8): Attempting to detach device tapa01679b1-15 with device alias net1 from instance 8a96324d-81f3-42dd-9974-a49392009d7f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.793 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] detach device xml: <interface type="ethernet">
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:2e:ef:43"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <target dev="tapa01679b1-15"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </interface>
Nov 24 01:59:16 compute-0 nova_compute[186999]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 01:59:16 compute-0 kernel: tapa01679b1-15 (unregistering): left promiscuous mode
Nov 24 01:59:16 compute-0 NetworkManager[55458]: <info>  [1763949556.8890] device (tapa01679b1-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 01:59:16 compute-0 ovn_controller[95380]: 2025-11-24T01:59:16Z|00056|binding|INFO|Releasing lport a01679b1-1507-45f5-ab78-92e9417de5c7 from this chassis (sb_readonly=0)
Nov 24 01:59:16 compute-0 ovn_controller[95380]: 2025-11-24T01:59:16Z|00057|binding|INFO|Setting lport a01679b1-1507-45f5-ab78-92e9417de5c7 down in Southbound
Nov 24 01:59:16 compute-0 ovn_controller[95380]: 2025-11-24T01:59:16Z|00058|binding|INFO|Removing iface tapa01679b1-15 ovn-installed in OVS
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.897 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.900 187003 DEBUG nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Received event <DeviceRemovedEvent: 1763949556.900046, 8a96324d-81f3-42dd-9974-a49392009d7f => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.901 187003 DEBUG nova.virt.libvirt.driver [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Start waiting for the detach event from libvirt for device tapa01679b1-15 with device alias net1 for instance 8a96324d-81f3-42dd-9974-a49392009d7f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.902 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:16.903 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ef:43 10.100.0.22'], port_security=['fa:16:3e:2e:ef:43 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=307cd83f-c0cf-442e-bc72-70d62a2c9124, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a01679b1-1507-45f5-ab78-92e9417de5c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:59:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:16.905 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a01679b1-1507-45f5-ab78-92e9417de5c7 in datapath ee0e8153-ce03-4d00-ba76-00ea43cfdc08 unbound from our chassis
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.905 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <name>instance-00000003</name>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <uuid>8a96324d-81f3-42dd-9974-a49392009d7f</uuid>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:15</nova:creationTime>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:port uuid="a01679b1-1507-45f5-ab78-92e9417de5c7">
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <resource>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </resource>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <system>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='serial'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='uuid'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </system>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <os>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </os>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <features>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </features>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk' index='2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config' index='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:42:ef:4d'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target dev='tap81142ca2-75'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       </target>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </console>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <video>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </video>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c349,c991</label>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c349,c991</imagelabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </domain>
Nov 24 01:59:16 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.906 187003 INFO nova.virt.libvirt.driver [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully detached device tapa01679b1-15 from instance 8a96324d-81f3-42dd-9974-a49392009d7f from the live domain config.
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.906 187003 DEBUG nova.virt.libvirt.vif [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.907 187003 DEBUG nova.network.os_vif_util [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.907 187003 DEBUG nova.network.os_vif_util [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.908 187003 DEBUG os_vif [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 01:59:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:16.907 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee0e8153-ce03-4d00-ba76-00ea43cfdc08, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.909 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.910 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa01679b1-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:16.908 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0a937032-959d-43ee-a13a-4a3b1ac7258e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:16.909 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08 namespace which is not needed anymore
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.911 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.914 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.916 187003 INFO os_vif [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15')
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.917 187003 DEBUG nova.virt.libvirt.guest [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:16</nova:creationTime>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:16 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:16 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:16 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:16 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:16 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 01:59:16 compute-0 nova_compute[186999]: 2025-11-24 01:59:16.918 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:17 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [NOTICE]   (214652) : haproxy version is 2.8.14-c23fe91
Nov 24 01:59:17 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [NOTICE]   (214652) : path to executable is /usr/sbin/haproxy
Nov 24 01:59:17 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [WARNING]  (214652) : Exiting Master process...
Nov 24 01:59:17 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [ALERT]    (214652) : Current worker (214654) exited with code 143 (Terminated)
Nov 24 01:59:17 compute-0 neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08[214646]: [WARNING]  (214652) : All workers exited. Exiting... (0)
Nov 24 01:59:17 compute-0 systemd[1]: libpod-86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb.scope: Deactivated successfully.
Nov 24 01:59:17 compute-0 podman[214685]: 2025-11-24 01:59:17.065495775 +0000 UTC m=+0.049086167 container died 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 24 01:59:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb-userdata-shm.mount: Deactivated successfully.
Nov 24 01:59:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a86960dcf5f4494c23ab6334f2c2e2b73fd6350bcdd6e9420463c735303c7c1-merged.mount: Deactivated successfully.
Nov 24 01:59:17 compute-0 podman[214685]: 2025-11-24 01:59:17.106631043 +0000 UTC m=+0.090221455 container cleanup 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:59:17 compute-0 systemd[1]: libpod-conmon-86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb.scope: Deactivated successfully.
Nov 24 01:59:17 compute-0 podman[214715]: 2025-11-24 01:59:17.17361018 +0000 UTC m=+0.041996273 container remove 86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.181 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb2c7b8-3c00-4af2-8f81-c67927c8189b]: (4, ('Mon Nov 24 01:59:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08 (86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb)\n86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb\nMon Nov 24 01:59:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08 (86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb)\n86a1676d0a373d6d03755719d3d685d9ff5b8098c18d749dd55ea6bac858a6cb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.184 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[af2c2d81-9a27-4a49-bcb2-45ee2195eb29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.186 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee0e8153-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:17 compute-0 kernel: tapee0e8153-c0: left promiscuous mode
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.189 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.201 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.205 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[526192c4-0c66-4098-91f6-1b499080d08b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.219 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9fbe4a-ff94-4d82-905f-1c5548d2e256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.221 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0306a1-2489-4357-9f9e-6431ed202963]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.239 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d01d51-25fa-4bec-9214-5ca0966cb935]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 302208, 'reachable_time': 18238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214730, 'error': None, 'target': 'ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.242 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee0e8153-ce03-4d00-ba76-00ea43cfdc08 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 01:59:17 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:17.242 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[e3329acc-53aa-4b01-bad3-30f25be44ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:17 compute-0 systemd[1]: run-netns-ovnmeta\x2dee0e8153\x2dce03\x2d4d00\x2dba76\x2d00ea43cfdc08.mount: Deactivated successfully.
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.660 187003 DEBUG nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.661 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.661 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.662 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.662 187003 DEBUG nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.662 187003 WARNING nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 for instance with vm_state active and task_state None.
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.663 187003 DEBUG nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-unplugged-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.663 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.663 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.664 187003 DEBUG oslo_concurrency.lockutils [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.664 187003 DEBUG nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-unplugged-a01679b1-1507-45f5-ab78-92e9417de5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.664 187003 WARNING nova.compute.manager [req-ba33c9cc-ddcb-45d8-bb69-e872206762fc req-41cc4dbb-6375-4baa-849a-ee8877551347 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-unplugged-a01679b1-1507-45f5-ab78-92e9417de5c7 for instance with vm_state active and task_state None.
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.907 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.908 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.908 187003 DEBUG nova.network.neutron [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.958 187003 DEBUG nova.compute.manager [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-deleted-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.959 187003 INFO nova.compute.manager [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Neutron deleted interface a01679b1-1507-45f5-ab78-92e9417de5c7; detaching it from the instance and deleting it from the info cache
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.959 187003 DEBUG nova.network.neutron [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:17 compute-0 nova_compute[186999]: 2025-11-24 01:59:17.980 187003 DEBUG nova.objects.instance [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lazy-loading 'system_metadata' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.022 187003 DEBUG nova.objects.instance [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lazy-loading 'flavor' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.068 187003 DEBUG nova.virt.libvirt.vif [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.069 187003 DEBUG nova.network.os_vif_util [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.071 187003 DEBUG nova.network.os_vif_util [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.073 187003 DEBUG nova.virt.libvirt.guest [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.077 187003 DEBUG nova.virt.libvirt.guest [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <name>instance-00000003</name>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <uuid>8a96324d-81f3-42dd-9974-a49392009d7f</uuid>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:16</nova:creationTime>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <resource>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </resource>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <system>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='serial'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='uuid'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </system>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <os>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </os>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <features>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </features>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk' index='2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config' index='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:42:ef:4d'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='tap81142ca2-75'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       </target>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </console>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <video>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </video>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c349,c991</label>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c349,c991</imagelabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]: </domain>
Nov 24 01:59:18 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.077 187003 DEBUG nova.virt.libvirt.guest [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.080 187003 DEBUG nova.virt.libvirt.guest [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2e:ef:43"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa01679b1-15"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <name>instance-00000003</name>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <uuid>8a96324d-81f3-42dd-9974-a49392009d7f</uuid>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:16</nova:creationTime>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <resource>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </resource>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <system>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='serial'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='uuid'>8a96324d-81f3-42dd-9974-a49392009d7f</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </system>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <os>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </os>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <features>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </features>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk' index='2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/disk.config' index='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </controller>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:42:ef:4d'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target dev='tap81142ca2-75'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       </target>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f/console.log' append='off'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </console>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </input>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </graphics>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <video>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </video>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c349,c991</label>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c349,c991</imagelabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 01:59:18 compute-0 nova_compute[186999]: </domain>
Nov 24 01:59:18 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.080 187003 WARNING nova.virt.libvirt.driver [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Detaching interface fa:16:3e:2e:ef:43 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapa01679b1-15' not found.
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.081 187003 DEBUG nova.virt.libvirt.vif [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.081 187003 DEBUG nova.network.os_vif_util [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converting VIF {"id": "a01679b1-1507-45f5-ab78-92e9417de5c7", "address": "fa:16:3e:2e:ef:43", "network": {"id": "ee0e8153-ce03-4d00-ba76-00ea43cfdc08", "bridge": "br-int", "label": "tempest-network-smoke--747940562", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa01679b1-15", "ovs_interfaceid": "a01679b1-1507-45f5-ab78-92e9417de5c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.082 187003 DEBUG nova.network.os_vif_util [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.082 187003 DEBUG os_vif [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.083 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.083 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa01679b1-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.084 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.086 187003 INFO os_vif [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ef:43,bridge_name='br-int',has_traffic_filtering=True,id=a01679b1-1507-45f5-ab78-92e9417de5c7,network=Network(ee0e8153-ce03-4d00-ba76-00ea43cfdc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa01679b1-15')
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.086 187003 DEBUG nova.virt.libvirt.guest [req-95d73401-3d54-4696-99ea-b37bada6422b req-2eb37a0e-9af4-463f-a28c-6b04874f7e8a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-407998524</nova:name>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 01:59:18</nova:creationTime>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     <nova:port uuid="81142ca2-757d-4009-a916-5629cc1bff67">
Nov 24 01:59:18 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 01:59:18 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 01:59:18 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 01:59:18 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 01:59:18 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 01:59:18 compute-0 ovn_controller[95380]: 2025-11-24T01:59:18Z|00059|binding|INFO|Releasing lport 309d406e-bca9-4d02-bf94-de7a1bfd7dea from this chassis (sb_readonly=0)
Nov 24 01:59:18 compute-0 nova_compute[186999]: 2025-11-24 01:59:18.705 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.042 187003 INFO nova.network.neutron [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Port a01679b1-1507-45f5-ab78-92e9417de5c7 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.043 187003 DEBUG nova.network.neutron [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.060 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.084 187003 DEBUG oslo_concurrency.lockutils [None req-48b82263-978d-48df-bf55-3742d3afd9a2 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-8a96324d-81f3-42dd-9974-a49392009d7f-a01679b1-1507-45f5-ab78-92e9417de5c7" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.742 187003 DEBUG nova.compute.manager [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.743 187003 DEBUG oslo_concurrency.lockutils [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.743 187003 DEBUG oslo_concurrency.lockutils [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.743 187003 DEBUG oslo_concurrency.lockutils [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.744 187003 DEBUG nova.compute.manager [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.744 187003 WARNING nova.compute.manager [req-7275b134-cea7-43f9-b250-aec5f9a87d13 req-db56dbe0-f8c7-4062-a051-2faba752ef64 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-plugged-a01679b1-1507-45f5-ab78-92e9417de5c7 for instance with vm_state active and task_state None.
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.922 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.922 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.923 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.924 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.924 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.926 187003 INFO nova.compute.manager [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Terminating instance
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.928 187003 DEBUG nova.compute.manager [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 01:59:19 compute-0 kernel: tap81142ca2-75 (unregistering): left promiscuous mode
Nov 24 01:59:19 compute-0 NetworkManager[55458]: <info>  [1763949559.9630] device (tap81142ca2-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.974 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:19 compute-0 ovn_controller[95380]: 2025-11-24T01:59:19Z|00060|binding|INFO|Releasing lport 81142ca2-757d-4009-a916-5629cc1bff67 from this chassis (sb_readonly=0)
Nov 24 01:59:19 compute-0 ovn_controller[95380]: 2025-11-24T01:59:19Z|00061|binding|INFO|Setting lport 81142ca2-757d-4009-a916-5629cc1bff67 down in Southbound
Nov 24 01:59:19 compute-0 ovn_controller[95380]: 2025-11-24T01:59:19Z|00062|binding|INFO|Removing iface tap81142ca2-75 ovn-installed in OVS
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.983 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:19 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:19.983 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:ef:4d 10.100.0.14'], port_security=['fa:16:3e:42:ef:4d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8a96324d-81f3-42dd-9974-a49392009d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '586ea056-5f50-47a9-ae5d-2f44abd0c7c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=627b2843-fb67-4184-b3db-30deb84eba89, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=81142ca2-757d-4009-a916-5629cc1bff67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:59:19 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:19.985 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 81142ca2-757d-4009-a916-5629cc1bff67 in datapath f2383360-95a5-4b5a-9aa4-a99b489f9cee unbound from our chassis
Nov 24 01:59:19 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:19.986 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2383360-95a5-4b5a-9aa4-a99b489f9cee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 01:59:19 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:19.987 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6a12e595-a513-4288-b477-b43bfd91ba04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:19 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:19.989 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee namespace which is not needed anymore
Nov 24 01:59:19 compute-0 nova_compute[186999]: 2025-11-24 01:59:19.994 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 24 01:59:20 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 13.188s CPU time.
Nov 24 01:59:20 compute-0 systemd-machined[153319]: Machine qemu-3-instance-00000003 terminated.
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.044 187003 DEBUG nova.compute.manager [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-changed-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.045 187003 DEBUG nova.compute.manager [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing instance network info cache due to event network-changed-81142ca2-757d-4009-a916-5629cc1bff67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.045 187003 DEBUG oslo_concurrency.lockutils [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.046 187003 DEBUG oslo_concurrency.lockutils [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.046 187003 DEBUG nova.network.neutron [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Refreshing network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:59:20 compute-0 podman[214731]: 2025-11-24 01:59:20.07967099 +0000 UTC m=+0.079918372 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 24 01:59:20 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [NOTICE]   (214375) : haproxy version is 2.8.14-c23fe91
Nov 24 01:59:20 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [NOTICE]   (214375) : path to executable is /usr/sbin/haproxy
Nov 24 01:59:20 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [WARNING]  (214375) : Exiting Master process...
Nov 24 01:59:20 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [ALERT]    (214375) : Current worker (214377) exited with code 143 (Terminated)
Nov 24 01:59:20 compute-0 neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee[214371]: [WARNING]  (214375) : All workers exited. Exiting... (0)
Nov 24 01:59:20 compute-0 systemd[1]: libpod-834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495.scope: Deactivated successfully.
Nov 24 01:59:20 compute-0 podman[214773]: 2025-11-24 01:59:20.145330891 +0000 UTC m=+0.049872339 container died 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 01:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495-userdata-shm.mount: Deactivated successfully.
Nov 24 01:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-d370607bb3a132adea369a949871949a3668679e9304ea0c38231c003f347fa4-merged.mount: Deactivated successfully.
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.188 187003 INFO nova.virt.libvirt.driver [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance destroyed successfully.
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.188 187003 DEBUG nova.objects.instance [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 8a96324d-81f3-42dd-9974-a49392009d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:20 compute-0 podman[214773]: 2025-11-24 01:59:20.196061172 +0000 UTC m=+0.100602560 container cleanup 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.198 187003 DEBUG nova.virt.libvirt.vif [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:58:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-407998524',display_name='tempest-TestNetworkBasicOps-server-407998524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-407998524',id=3,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXXcvc2ehGtn28cKffsKJdjxG18MJNd2Yf9A+vQbbTF+kz0VJlzdUMqUtDR1bXi9JinKhJV18OQxjU5Yxkk82bdeZZCKUD8hEeWtp+wgBXiAo0k9cjbhTxWlaVp99npjw==',key_name='tempest-TestNetworkBasicOps-1442091332',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:58:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-xidy5unf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:58:43Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=8a96324d-81f3-42dd-9974-a49392009d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.199 187003 DEBUG nova.network.os_vif_util [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.200 187003 DEBUG nova.network.os_vif_util [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.200 187003 DEBUG os_vif [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.201 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.202 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81142ca2-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.203 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 systemd[1]: libpod-conmon-834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495.scope: Deactivated successfully.
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.206 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.208 187003 INFO os_vif [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:ef:4d,bridge_name='br-int',has_traffic_filtering=True,id=81142ca2-757d-4009-a916-5629cc1bff67,network=Network(f2383360-95a5-4b5a-9aa4-a99b489f9cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81142ca2-75')
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.208 187003 INFO nova.virt.libvirt.driver [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Deleting instance files /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f_del
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.209 187003 INFO nova.virt.libvirt.driver [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Deletion of /var/lib/nova/instances/8a96324d-81f3-42dd-9974-a49392009d7f_del complete
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.247 187003 INFO nova.compute.manager [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Took 0.32 seconds to destroy the instance on the hypervisor.
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.248 187003 DEBUG oslo.service.loopingcall [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.248 187003 DEBUG nova.compute.manager [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.248 187003 DEBUG nova.network.neutron [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 01:59:20 compute-0 podman[214818]: 2025-11-24 01:59:20.272552909 +0000 UTC m=+0.050730272 container remove 834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.277 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[29df2c6d-c680-4fdf-b65c-b407b0ebc361]: (4, ('Mon Nov 24 01:59:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee (834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495)\n834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495\nMon Nov 24 01:59:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee (834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495)\n834cba6820da432e9d6330561d7125d78bebeb8a2755218ac92896cb5f1a2495\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.279 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[fba4f51c-4e28-4f75-986d-f0e5720e35a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.280 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2383360-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.281 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 kernel: tapf2383360-90: left promiscuous mode
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.297 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 nova_compute[186999]: 2025-11-24 01:59:20.299 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.300 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[709b8a01-0200-42b5-9c41-14de1c3133d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.320 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[5ece9f75-0344-409f-98cf-babaab8ba8ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.322 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9dfcd9bd-9666-46ce-b455-370f71437da0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.341 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[94b732b7-5274-4458-a27f-afce51ed1122]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 298972, 'reachable_time': 39623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214831, 'error': None, 'target': 'ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.343 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2383360-95a5-4b5a-9aa4-a99b489f9cee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 01:59:20 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:20.343 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9024ed-8888-46aa-8532-8d621c56e2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:20 compute-0 systemd[1]: run-netns-ovnmeta\x2df2383360\x2d95a5\x2d4b5a\x2d9aa4\x2da99b489f9cee.mount: Deactivated successfully.
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.436 187003 DEBUG nova.network.neutron [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.459 187003 INFO nova.compute.manager [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Took 1.21 seconds to deallocate network for instance.
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.462 187003 DEBUG nova.network.neutron [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updated VIF entry in instance network info cache for port 81142ca2-757d-4009-a916-5629cc1bff67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.462 187003 DEBUG nova.network.neutron [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Updating instance_info_cache with network_info: [{"id": "81142ca2-757d-4009-a916-5629cc1bff67", "address": "fa:16:3e:42:ef:4d", "network": {"id": "f2383360-95a5-4b5a-9aa4-a99b489f9cee", "bridge": "br-int", "label": "tempest-network-smoke--335341168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81142ca2-75", "ovs_interfaceid": "81142ca2-757d-4009-a916-5629cc1bff67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.492 187003 DEBUG oslo_concurrency.lockutils [req-79acf810-6199-4950-bbc6-090baa334f44 req-e8f132c7-b2db-4a76-b64b-91fce8ded0d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-8a96324d-81f3-42dd-9974-a49392009d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.499 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.499 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.852 187003 DEBUG nova.compute.provider_tree [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.867 187003 DEBUG nova.scheduler.client.report [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.885 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.904 187003 INFO nova.scheduler.client.report [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 8a96324d-81f3-42dd-9974-a49392009d7f
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.918 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:21 compute-0 nova_compute[186999]: 2025-11-24 01:59:21.956 187003 DEBUG oslo_concurrency.lockutils [None req-844e5191-f6c9-4dc9-ba71-f6f3f0294363 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.102 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-unplugged-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.103 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.103 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.103 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.103 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-unplugged-81142ca2-757d-4009-a916-5629cc1bff67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.104 187003 WARNING nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-unplugged-81142ca2-757d-4009-a916-5629cc1bff67 for instance with vm_state deleted and task_state None.
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.104 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.104 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.104 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.105 187003 DEBUG oslo_concurrency.lockutils [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "8a96324d-81f3-42dd-9974-a49392009d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.105 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] No waiting events found dispatching network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.105 187003 WARNING nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received unexpected event network-vif-plugged-81142ca2-757d-4009-a916-5629cc1bff67 for instance with vm_state deleted and task_state None.
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.105 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Received event network-vif-deleted-81142ca2-757d-4009-a916-5629cc1bff67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.106 187003 INFO nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Neutron deleted interface 81142ca2-757d-4009-a916-5629cc1bff67; detaching it from the instance and deleting it from the info cache
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.106 187003 DEBUG nova.network.neutron [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 24 01:59:22 compute-0 nova_compute[186999]: 2025-11-24 01:59:22.108 187003 DEBUG nova.compute.manager [req-b6cc8786-fa05-4c30-983b-5b3af029d5a7 req-6b12d22d-d343-411a-84de-e302e3dd5e6f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Detach interface failed, port_id=81142ca2-757d-4009-a916-5629cc1bff67, reason: Instance 8a96324d-81f3-42dd-9974-a49392009d7f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 24 01:59:23 compute-0 podman[214832]: 2025-11-24 01:59:23.857338732 +0000 UTC m=+0.091372916 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Nov 24 01:59:25 compute-0 nova_compute[186999]: 2025-11-24 01:59:25.175 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:25 compute-0 nova_compute[186999]: 2025-11-24 01:59:25.203 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:25 compute-0 nova_compute[186999]: 2025-11-24 01:59:25.217 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:26 compute-0 nova_compute[186999]: 2025-11-24 01:59:26.920 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:30 compute-0 nova_compute[186999]: 2025-11-24 01:59:30.206 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:30 compute-0 podman[214856]: 2025-11-24 01:59:30.807280795 +0000 UTC m=+0.053306602 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 01:59:31 compute-0 nova_compute[186999]: 2025-11-24 01:59:31.973 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:34 compute-0 podman[214881]: 2025-11-24 01:59:34.80712734 +0000 UTC m=+0.056533461 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 24 01:59:35 compute-0 nova_compute[186999]: 2025-11-24 01:59:35.187 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949560.1861744, 8a96324d-81f3-42dd-9974-a49392009d7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:59:35 compute-0 nova_compute[186999]: 2025-11-24 01:59:35.188 187003 INFO nova.compute.manager [-] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] VM Stopped (Lifecycle Event)
Nov 24 01:59:35 compute-0 nova_compute[186999]: 2025-11-24 01:59:35.208 187003 DEBUG nova.compute.manager [None req-3f14b9d0-9ac6-4028-8663-bf6d2d674143 - - - - - -] [instance: 8a96324d-81f3-42dd-9974-a49392009d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:59:35 compute-0 nova_compute[186999]: 2025-11-24 01:59:35.238 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:36 compute-0 nova_compute[186999]: 2025-11-24 01:59:36.975 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:39 compute-0 podman[214900]: 2025-11-24 01:59:39.851990101 +0000 UTC m=+0.093504905 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 24 01:59:40 compute-0 nova_compute[186999]: 2025-11-24 01:59:40.240 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:41 compute-0 podman[214920]: 2025-11-24 01:59:41.826666321 +0000 UTC m=+0.074026911 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 01:59:41 compute-0 podman[214921]: 2025-11-24 01:59:41.880174698 +0000 UTC m=+0.122464549 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 24 01:59:42 compute-0 nova_compute[186999]: 2025-11-24 01:59:42.023 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.865 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.865 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.878 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.962 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.963 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.972 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 01:59:44 compute-0 nova_compute[186999]: 2025-11-24 01:59:44.972 187003 INFO nova.compute.claims [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Claim successful on node compute-0.ctlplane.example.com
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.243 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.325 187003 DEBUG nova.compute.provider_tree [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.349 187003 DEBUG nova.scheduler.client.report [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.368 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.369 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.475 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.475 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.498 187003 INFO nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.530 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.760 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.761 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.761 187003 INFO nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Creating image(s)
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.762 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.762 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.763 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.777 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.840 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.841 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.842 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.857 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.909 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.910 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.937 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.939 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.939 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.992 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.993 187003 DEBUG nova.virt.disk.api [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 01:59:45 compute-0 nova_compute[186999]: 2025-11-24 01:59:45.993 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.064 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.065 187003 DEBUG nova.virt.disk.api [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.066 187003 DEBUG nova.objects.instance [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 80db970f-9856-4c4d-9dff-715dafb6a925 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.076 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.076 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Ensure instance console log exists: /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.076 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.077 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.077 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:46 compute-0 nova_compute[186999]: 2025-11-24 01:59:46.370 187003 DEBUG nova.policy [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 01:59:47 compute-0 nova_compute[186999]: 2025-11-24 01:59:47.024 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:47 compute-0 nova_compute[186999]: 2025-11-24 01:59:47.573 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:47 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:47.573 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:59:47 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:47.574 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 01:59:47 compute-0 nova_compute[186999]: 2025-11-24 01:59:47.725 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Successfully created port: cacdcf87-dd20-4393-8ddb-3e196f1bec94 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 01:59:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:48.420 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:48.420 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:48.420 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:48 compute-0 nova_compute[186999]: 2025-11-24 01:59:48.936 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Successfully updated port: cacdcf87-dd20-4393-8ddb-3e196f1bec94 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 01:59:48 compute-0 nova_compute[186999]: 2025-11-24 01:59:48.949 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:48 compute-0 nova_compute[186999]: 2025-11-24 01:59:48.949 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:48 compute-0 nova_compute[186999]: 2025-11-24 01:59:48.949 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 01:59:49 compute-0 nova_compute[186999]: 2025-11-24 01:59:49.281 187003 DEBUG nova.compute.manager [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:49 compute-0 nova_compute[186999]: 2025-11-24 01:59:49.281 187003 DEBUG nova.compute.manager [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing instance network info cache due to event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:59:49 compute-0 nova_compute[186999]: 2025-11-24 01:59:49.282 187003 DEBUG oslo_concurrency.lockutils [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:49 compute-0 nova_compute[186999]: 2025-11-24 01:59:49.588 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.246 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:50 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:50.576 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.719 187003 DEBUG nova.network.neutron [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.733 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.734 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Instance network_info: |[{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.735 187003 DEBUG oslo_concurrency.lockutils [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.735 187003 DEBUG nova.network.neutron [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.741 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Start _get_guest_xml network_info=[{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.749 187003 WARNING nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.761 187003 DEBUG nova.virt.libvirt.host [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.762 187003 DEBUG nova.virt.libvirt.host [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.766 187003 DEBUG nova.virt.libvirt.host [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.767 187003 DEBUG nova.virt.libvirt.host [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.768 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.768 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.769 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.770 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.770 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.771 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.771 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.772 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.772 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.772 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.773 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.773 187003 DEBUG nova.virt.hardware [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.779 187003 DEBUG nova.virt.libvirt.vif [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:59:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254954222',display_name='tempest-TestNetworkBasicOps-server-1254954222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254954222',id=4,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDkXTipxsu/cfKZkE2eA0ypez4hb7rz642IdxeUa3rRPrHPGPnY5t1ePwbUwP8GFtOsbs0xgrHWovHTVxfpoOWKLNHCgpC8dIF9IGf7s2bWpbN1Ejn3j98rbyN/+l5kDnA==',key_name='tempest-TestNetworkBasicOps-1290362523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-v1qedma7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:59:45Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=80db970f-9856-4c4d-9dff-715dafb6a925,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.780 187003 DEBUG nova.network.os_vif_util [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.781 187003 DEBUG nova.network.os_vif_util [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.782 187003 DEBUG nova.objects.instance [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 80db970f-9856-4c4d-9dff-715dafb6a925 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.795 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] End _get_guest_xml xml=<domain type="kvm">
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <uuid>80db970f-9856-4c4d-9dff-715dafb6a925</uuid>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <name>instance-00000004</name>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <metadata>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-1254954222</nova:name>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 01:59:50</nova:creationTime>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         <nova:port uuid="cacdcf87-dd20-4393-8ddb-3e196f1bec94">
Nov 24 01:59:50 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </metadata>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <system>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="serial">80db970f-9856-4c4d-9dff-715dafb6a925</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="uuid">80db970f-9856-4c4d-9dff-715dafb6a925</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </system>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <os>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </os>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <features>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <apic/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </features>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </clock>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </cpu>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   <devices>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.config"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </disk>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:77:04:71"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <target dev="tapcacdcf87-dd"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </interface>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/console.log" append="off"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </serial>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <video>
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </video>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </rng>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 01:59:50 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 01:59:50 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 01:59:50 compute-0 nova_compute[186999]:   </devices>
Nov 24 01:59:50 compute-0 nova_compute[186999]: </domain>
Nov 24 01:59:50 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.797 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Preparing to wait for external event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.798 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.798 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.799 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.800 187003 DEBUG nova.virt.libvirt.vif [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T01:59:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254954222',display_name='tempest-TestNetworkBasicOps-server-1254954222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254954222',id=4,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDkXTipxsu/cfKZkE2eA0ypez4hb7rz642IdxeUa3rRPrHPGPnY5t1ePwbUwP8GFtOsbs0xgrHWovHTVxfpoOWKLNHCgpC8dIF9IGf7s2bWpbN1Ejn3j98rbyN/+l5kDnA==',key_name='tempest-TestNetworkBasicOps-1290362523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-v1qedma7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T01:59:45Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=80db970f-9856-4c4d-9dff-715dafb6a925,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.801 187003 DEBUG nova.network.os_vif_util [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.802 187003 DEBUG nova.network.os_vif_util [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.803 187003 DEBUG os_vif [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.804 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.804 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.805 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.808 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.809 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcacdcf87-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.810 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcacdcf87-dd, col_values=(('external_ids', {'iface-id': 'cacdcf87-dd20-4393-8ddb-3e196f1bec94', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:04:71', 'vm-uuid': '80db970f-9856-4c4d-9dff-715dafb6a925'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.812 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:50 compute-0 NetworkManager[55458]: <info>  [1763949590.8134] manager: (tapcacdcf87-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.814 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.819 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.820 187003 INFO os_vif [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd')
Nov 24 01:59:50 compute-0 podman[214987]: 2025-11-24 01:59:50.84122721 +0000 UTC m=+0.086483171 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.878 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.879 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.879 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:77:04:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 01:59:50 compute-0 nova_compute[186999]: 2025-11-24 01:59:50.880 187003 INFO nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Using config drive
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.141 187003 INFO nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Creating config drive at /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.config
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.145 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjxouunfg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.274 187003 DEBUG oslo_concurrency.processutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjxouunfg" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 01:59:51 compute-0 kernel: tapcacdcf87-dd: entered promiscuous mode
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.3306] manager: (tapcacdcf87-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.331 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_controller[95380]: 2025-11-24T01:59:51Z|00063|binding|INFO|Claiming lport cacdcf87-dd20-4393-8ddb-3e196f1bec94 for this chassis.
Nov 24 01:59:51 compute-0 ovn_controller[95380]: 2025-11-24T01:59:51Z|00064|binding|INFO|cacdcf87-dd20-4393-8ddb-3e196f1bec94: Claiming fa:16:3e:77:04:71 10.100.0.10
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.335 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.347 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:04:71 10.100.0.10'], port_security=['fa:16:3e:77:04:71 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '80db970f-9856-4c4d-9dff-715dafb6a925', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b2607ce-da81-4f0e-9324-8381381e0e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c169b27d-da28-49aa-b924-137f9275f851', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84b01f9-1343-4122-aba2-6c3c243d2f99, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=cacdcf87-dd20-4393-8ddb-3e196f1bec94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.348 104238 INFO neutron.agent.ovn.metadata.agent [-] Port cacdcf87-dd20-4393-8ddb-3e196f1bec94 in datapath 5b2607ce-da81-4f0e-9324-8381381e0e39 bound to our chassis
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.349 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b2607ce-da81-4f0e-9324-8381381e0e39
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.364 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0c696057-f0c1-49f1-961a-b1c43e282b22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.365 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5b2607ce-d1 in ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.367 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5b2607ce-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.368 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6af02be5-9655-465b-b3cd-6c662aca7bdb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.368 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e0eec6d8-7018-4026-b501-5dcb06e577f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 systemd-udevd[215026]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.3862] device (tapcacdcf87-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.3872] device (tapcacdcf87-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.390 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[0442417c-24dc-4963-b4aa-282008aebadb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 systemd-machined[153319]: New machine qemu-4-instance-00000004.
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.421 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_controller[95380]: 2025-11-24T01:59:51Z|00065|binding|INFO|Setting lport cacdcf87-dd20-4393-8ddb-3e196f1bec94 ovn-installed in OVS
Nov 24 01:59:51 compute-0 ovn_controller[95380]: 2025-11-24T01:59:51Z|00066|binding|INFO|Setting lport cacdcf87-dd20-4393-8ddb-3e196f1bec94 up in Southbound
Nov 24 01:59:51 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.428 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.429 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e5248c9a-e44e-4cfd-9d7a-b67cd53cf3f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.453 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[efd17ee5-fcc9-415a-81e9-2da9103cc400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 systemd-udevd[215030]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.4619] manager: (tap5b2607ce-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.461 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[71320a8e-28a6-427f-98a3-b04a73162bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.495 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[45414634-5e3b-4854-850e-cc8fb069d9f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.497 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[165b7493-475a-4abd-be3b-200ddd2698de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.5236] device (tap5b2607ce-d0): carrier: link connected
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.528 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[4cefba47-c783-46cb-985c-c31582ccecc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.547 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[314f9e5d-fac8-40a2-8b3b-8bdbc26e5ce9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b2607ce-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:3d:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 305818, 'reachable_time': 26892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215060, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.566 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[289b0535-b38a-4f49-b93f-bf1c0b05dd36]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:3dfd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 305818, 'tstamp': 305818}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215061, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.581 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f73e92-df60-4ce6-a17c-8060664df2f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b2607ce-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:3d:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 305818, 'reachable_time': 26892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215062, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.616 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[86dcb7d4-82ab-4a4f-85b5-29304149226a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.676 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ee5676-e91c-4207-90d3-aa026bc948e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.680 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b2607ce-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.680 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.681 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b2607ce-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:51 compute-0 NetworkManager[55458]: <info>  [1763949591.6837] manager: (tap5b2607ce-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 24 01:59:51 compute-0 kernel: tap5b2607ce-d0: entered promiscuous mode
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.683 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.686 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b2607ce-d0, col_values=(('external_ids', {'iface-id': '1d1ae020-8e04-4ae9-898c-f460d6d5fb29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.687 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_controller[95380]: 2025-11-24T01:59:51Z|00067|binding|INFO|Releasing lport 1d1ae020-8e04-4ae9-898c-f460d6d5fb29 from this chassis (sb_readonly=0)
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.710 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.711 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5b2607ce-da81-4f0e-9324-8381381e0e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5b2607ce-da81-4f0e-9324-8381381e0e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.712 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[07940590-e95c-4dc9-b458-006bfa1a27e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.716 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: global
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-5b2607ce-da81-4f0e-9324-8381381e0e39
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/5b2607ce-da81-4f0e-9324-8381381e0e39.pid.haproxy
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 5b2607ce-da81-4f0e-9324-8381381e0e39
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 01:59:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 01:59:51.717 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'env', 'PROCESS_TAG=haproxy-5b2607ce-da81-4f0e-9324-8381381e0e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5b2607ce-da81-4f0e-9324-8381381e0e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.772 187003 DEBUG nova.compute.manager [req-d325555d-a905-420e-aa57-0b0364ae95b9 req-62f624bd-c229-4e0c-8b65-693e9de72c3f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.773 187003 DEBUG oslo_concurrency.lockutils [req-d325555d-a905-420e-aa57-0b0364ae95b9 req-62f624bd-c229-4e0c-8b65-693e9de72c3f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.773 187003 DEBUG oslo_concurrency.lockutils [req-d325555d-a905-420e-aa57-0b0364ae95b9 req-62f624bd-c229-4e0c-8b65-693e9de72c3f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.773 187003 DEBUG oslo_concurrency.lockutils [req-d325555d-a905-420e-aa57-0b0364ae95b9 req-62f624bd-c229-4e0c-8b65-693e9de72c3f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:51 compute-0 nova_compute[186999]: 2025-11-24 01:59:51.774 187003 DEBUG nova.compute.manager [req-d325555d-a905-420e-aa57-0b0364ae95b9 req-62f624bd-c229-4e0c-8b65-693e9de72c3f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Processing event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.026 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:52 compute-0 podman[215098]: 2025-11-24 01:59:52.062986635 +0000 UTC m=+0.048306383 container create 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.076 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.077 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949592.0759578, 80db970f-9856-4c4d-9dff-715dafb6a925 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.077 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] VM Started (Lifecycle Event)
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.079 187003 DEBUG nova.network.neutron [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updated VIF entry in instance network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.080 187003 DEBUG nova.network.neutron [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.081 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.085 187003 INFO nova.virt.libvirt.driver [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Instance spawned successfully.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.085 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 01:59:52 compute-0 systemd[1]: Started libpod-conmon-4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684.scope.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.119 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.123 187003 DEBUG oslo_concurrency.lockutils [req-27b15a90-65c4-4ca1-865d-297512ecc4ce req-ce75c75f-3dcb-4d7a-bff8-0f6dee72b929 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.124 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.131 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.131 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.132 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.132 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.132 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.133 187003 DEBUG nova.virt.libvirt.driver [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 01:59:52 compute-0 podman[215098]: 2025-11-24 01:59:52.038553951 +0000 UTC m=+0.023873729 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.138 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.138 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949592.0761693, 80db970f-9856-4c4d-9dff-715dafb6a925 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.138 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] VM Paused (Lifecycle Event)
Nov 24 01:59:52 compute-0 systemd[1]: Started libcrun container.
Nov 24 01:59:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d96b22ecfded2be3916f76e925867669ac0016308df7bafae711afeba87fe76/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.159 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.162 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949592.0799797, 80db970f-9856-4c4d-9dff-715dafb6a925 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.162 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] VM Resumed (Lifecycle Event)
Nov 24 01:59:52 compute-0 podman[215098]: 2025-11-24 01:59:52.168728082 +0000 UTC m=+0.154047850 container init 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 01:59:52 compute-0 podman[215098]: 2025-11-24 01:59:52.178396663 +0000 UTC m=+0.163716411 container start 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.180 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.188 187003 INFO nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Took 6.43 seconds to spawn the instance on the hypervisor.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.189 187003 DEBUG nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.190 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 01:59:52 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [NOTICE]   (215118) : New worker (215120) forked
Nov 24 01:59:52 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [NOTICE]   (215118) : Loading success.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.218 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.250 187003 INFO nova.compute.manager [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Took 7.33 seconds to build instance.
Nov 24 01:59:52 compute-0 nova_compute[186999]: 2025-11-24 01:59:52.270 187003 DEBUG oslo_concurrency.lockutils [None req-ebe8888b-add9-4d12-a9e0-cab4d0248dd3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.833 187003 DEBUG nova.compute.manager [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.834 187003 DEBUG oslo_concurrency.lockutils [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.835 187003 DEBUG oslo_concurrency.lockutils [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.835 187003 DEBUG oslo_concurrency.lockutils [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.836 187003 DEBUG nova.compute.manager [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] No waiting events found dispatching network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 01:59:53 compute-0 nova_compute[186999]: 2025-11-24 01:59:53.836 187003 WARNING nova.compute.manager [req-c33dd27d-8c04-496a-95bd-e0f4b326b756 req-39096b6f-bffa-45ea-9d2f-057582cf92d6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received unexpected event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 for instance with vm_state active and task_state None.
Nov 24 01:59:54 compute-0 podman[215129]: 2025-11-24 01:59:54.803597955 +0000 UTC m=+0.060137664 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public)
Nov 24 01:59:55 compute-0 ovn_controller[95380]: 2025-11-24T01:59:55Z|00068|binding|INFO|Releasing lport 1d1ae020-8e04-4ae9-898c-f460d6d5fb29 from this chassis (sb_readonly=0)
Nov 24 01:59:55 compute-0 NetworkManager[55458]: <info>  [1763949595.4089] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 24 01:59:55 compute-0 NetworkManager[55458]: <info>  [1763949595.4107] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.407 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:55 compute-0 ovn_controller[95380]: 2025-11-24T01:59:55Z|00069|binding|INFO|Releasing lport 1d1ae020-8e04-4ae9-898c-f460d6d5fb29 from this chassis (sb_readonly=0)
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.434 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.442 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.684 187003 DEBUG nova.compute.manager [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.685 187003 DEBUG nova.compute.manager [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing instance network info cache due to event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.685 187003 DEBUG oslo_concurrency.lockutils [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.685 187003 DEBUG oslo_concurrency.lockutils [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.685 187003 DEBUG nova.network.neutron [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 01:59:55 compute-0 nova_compute[186999]: 2025-11-24 01:59:55.812 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:56 compute-0 sshd-session[215151]: Invalid user admin1234 from 154.90.59.75 port 53258
Nov 24 01:59:57 compute-0 nova_compute[186999]: 2025-11-24 01:59:57.027 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 01:59:57 compute-0 sshd-session[215151]: Received disconnect from 154.90.59.75 port 53258:11: Bye Bye [preauth]
Nov 24 01:59:57 compute-0 sshd-session[215151]: Disconnected from invalid user admin1234 154.90.59.75 port 53258 [preauth]
Nov 24 01:59:57 compute-0 nova_compute[186999]: 2025-11-24 01:59:57.717 187003 DEBUG nova.network.neutron [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updated VIF entry in instance network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 01:59:57 compute-0 nova_compute[186999]: 2025-11-24 01:59:57.718 187003 DEBUG nova.network.neutron [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 01:59:57 compute-0 nova_compute[186999]: 2025-11-24 01:59:57.753 187003 DEBUG oslo_concurrency.lockutils [req-3d27a123-31db-43cf-8eb0-1129303f3b60 req-60c9c859-b471-4935-a471-8eb5bbe529e1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:00 compute-0 nova_compute[186999]: 2025-11-24 02:00:00.815 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:00 compute-0 podman[215153]: 2025-11-24 02:00:00.921455663 +0000 UTC m=+0.069246768 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:00:02 compute-0 nova_compute[186999]: 2025-11-24 02:00:02.029 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:03 compute-0 ovn_controller[95380]: 2025-11-24T02:00:03Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:04:71 10.100.0.10
Nov 24 02:00:03 compute-0 ovn_controller[95380]: 2025-11-24T02:00:03Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:04:71 10.100.0.10
Nov 24 02:00:05 compute-0 podman[215190]: 2025-11-24 02:00:05.797224167 +0000 UTC m=+0.047100878 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:00:05 compute-0 nova_compute[186999]: 2025-11-24 02:00:05.818 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:06 compute-0 nova_compute[186999]: 2025-11-24 02:00:06.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:07 compute-0 nova_compute[186999]: 2025-11-24 02:00:07.030 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:09 compute-0 nova_compute[186999]: 2025-11-24 02:00:09.766 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:09 compute-0 nova_compute[186999]: 2025-11-24 02:00:09.785 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:09 compute-0 nova_compute[186999]: 2025-11-24 02:00:09.785 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:00:09 compute-0 nova_compute[186999]: 2025-11-24 02:00:09.785 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.084 187003 INFO nova.compute.manager [None req-8e912990-4feb-4436-9fe5-7faa236883f6 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Get console output
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.105 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.606 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.606 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.607 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.607 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 80db970f-9856-4c4d-9dff-715dafb6a925 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:00:10 compute-0 podman[215209]: 2025-11-24 02:00:10.802669478 +0000 UTC m=+0.060677009 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 24 02:00:10 compute-0 nova_compute[186999]: 2025-11-24 02:00:10.858 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.073 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.641 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.657 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.658 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.658 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.659 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.659 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.660 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.660 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.685 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.685 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.686 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.686 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.774 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:12 compute-0 podman[215229]: 2025-11-24 02:00:12.811030735 +0000 UTC m=+0.067473258 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:00:12 compute-0 podman[215231]: 2025-11-24 02:00:12.844154952 +0000 UTC m=+0.104343700 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.848 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.849 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:12 compute-0 nova_compute[186999]: 2025-11-24 02:00:12.906 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.003 187003 DEBUG nova.compute.manager [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.003 187003 DEBUG nova.compute.manager [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing instance network info cache due to event network-changed-cacdcf87-dd20-4393-8ddb-3e196f1bec94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.004 187003 DEBUG oslo_concurrency.lockutils [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.004 187003 DEBUG oslo_concurrency.lockutils [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.004 187003 DEBUG nova.network.neutron [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Refreshing network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.091 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.092 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5610MB free_disk=73.43325805664062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.092 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.092 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.153 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 80db970f-9856-4c4d-9dff-715dafb6a925 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.153 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.153 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.194 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.206 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.222 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:00:13 compute-0 nova_compute[186999]: 2025-11-24 02:00:13.222 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:14 compute-0 nova_compute[186999]: 2025-11-24 02:00:14.807 187003 DEBUG nova.network.neutron [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updated VIF entry in instance network info cache for port cacdcf87-dd20-4393-8ddb-3e196f1bec94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:00:14 compute-0 nova_compute[186999]: 2025-11-24 02:00:14.808 187003 DEBUG nova.network.neutron [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [{"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:14 compute-0 nova_compute[186999]: 2025-11-24 02:00:14.828 187003 DEBUG oslo_concurrency.lockutils [req-b596cd0e-e281-4b33-9711-e4e783db8f65 req-ad30e007-38e7-4610-bdec-e4fee437fd6c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-80db970f-9856-4c4d-9dff-715dafb6a925" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:15 compute-0 nova_compute[186999]: 2025-11-24 02:00:15.222 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:15 compute-0 nova_compute[186999]: 2025-11-24 02:00:15.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:00:15 compute-0 nova_compute[186999]: 2025-11-24 02:00:15.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:00:15 compute-0 nova_compute[186999]: 2025-11-24 02:00:15.861 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:17 compute-0 nova_compute[186999]: 2025-11-24 02:00:17.074 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:18 compute-0 nova_compute[186999]: 2025-11-24 02:00:18.913 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:18 compute-0 nova_compute[186999]: 2025-11-24 02:00:18.914 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:18 compute-0 nova_compute[186999]: 2025-11-24 02:00:18.931 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.017 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.017 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.024 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.024 187003 INFO nova.compute.claims [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.167 187003 DEBUG nova.compute.provider_tree [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.178 187003 DEBUG nova.scheduler.client.report [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.195 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.196 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.235 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.236 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.251 187003 INFO nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.267 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.360 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.361 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.361 187003 INFO nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Creating image(s)
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.362 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.362 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.363 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.373 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.459 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.460 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.461 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.473 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.562 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.564 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.607 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.609 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.610 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.632 187003 DEBUG nova.policy [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.680 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.681 187003 DEBUG nova.virt.disk.api [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.681 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.735 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.736 187003 DEBUG nova.virt.disk.api [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.737 187003 DEBUG nova.objects.instance [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.748 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.748 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Ensure instance console log exists: /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.749 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.749 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:19 compute-0 nova_compute[186999]: 2025-11-24 02:00:19.750 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:20 compute-0 nova_compute[186999]: 2025-11-24 02:00:20.312 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Successfully created port: de7f9d8a-48a4-4730-a774-102190ae66ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:00:20 compute-0 nova_compute[186999]: 2025-11-24 02:00:20.864 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.131 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Successfully updated port: de7f9d8a-48a4-4730-a774-102190ae66ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.142 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.143 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.143 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.208 187003 DEBUG nova.compute.manager [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-changed-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.209 187003 DEBUG nova.compute.manager [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Refreshing instance network info cache due to event network-changed-de7f9d8a-48a4-4730-a774-102190ae66ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.210 187003 DEBUG oslo_concurrency.lockutils [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:00:21 compute-0 nova_compute[186999]: 2025-11-24 02:00:21.287 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:00:21 compute-0 podman[215303]: 2025-11-24 02:00:21.805844117 +0000 UTC m=+0.062509449 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.079 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.164 187003 DEBUG nova.network.neutron [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updating instance_info_cache with network_info: [{"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.187 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.187 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Instance network_info: |[{"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.188 187003 DEBUG oslo_concurrency.lockutils [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.188 187003 DEBUG nova.network.neutron [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Refreshing network info cache for port de7f9d8a-48a4-4730-a774-102190ae66ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.192 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Start _get_guest_xml network_info=[{"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.196 187003 WARNING nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.201 187003 DEBUG nova.virt.libvirt.host [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.202 187003 DEBUG nova.virt.libvirt.host [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.205 187003 DEBUG nova.virt.libvirt.host [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.206 187003 DEBUG nova.virt.libvirt.host [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.206 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.206 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.207 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.207 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.207 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.208 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.208 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.208 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.208 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.209 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.209 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.209 187003 DEBUG nova.virt.hardware [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.213 187003 DEBUG nova.virt.libvirt.vif [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:00:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-566499225',display_name='tempest-TestNetworkBasicOps-server-566499225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-566499225',id=5,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEE3qfHT9nISnvSQTzYdD3SeSBu1w3zJLEgEWYnfIMG+c1PDvXrDOV6ARmzwkZ+NU4vuiwlmceQ1qxiZmo5l1snGqhlYlixBzIu76Ip6Om441/s9c1k67sCxOPk2okfbSw==',key_name='tempest-TestNetworkBasicOps-989980939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-fc0tnyu1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:00:19Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.214 187003 DEBUG nova.network.os_vif_util [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.214 187003 DEBUG nova.network.os_vif_util [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.215 187003 DEBUG nova.objects.instance [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.227 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <uuid>f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403</uuid>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <name>instance-00000005</name>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-566499225</nova:name>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:00:22</nova:creationTime>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         <nova:port uuid="de7f9d8a-48a4-4730-a774-102190ae66ef">
Nov 24 02:00:22 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <system>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="serial">f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="uuid">f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </system>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <os>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </os>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <features>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </features>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.config"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:59:8c:fa"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <target dev="tapde7f9d8a-48"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/console.log" append="off"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <video>
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </video>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:00:22 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:00:22 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:00:22 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:00:22 compute-0 nova_compute[186999]: </domain>
Nov 24 02:00:22 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.229 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Preparing to wait for external event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.230 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.230 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.230 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.231 187003 DEBUG nova.virt.libvirt.vif [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:00:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-566499225',display_name='tempest-TestNetworkBasicOps-server-566499225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-566499225',id=5,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEE3qfHT9nISnvSQTzYdD3SeSBu1w3zJLEgEWYnfIMG+c1PDvXrDOV6ARmzwkZ+NU4vuiwlmceQ1qxiZmo5l1snGqhlYlixBzIu76Ip6Om441/s9c1k67sCxOPk2okfbSw==',key_name='tempest-TestNetworkBasicOps-989980939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-fc0tnyu1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:00:19Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.232 187003 DEBUG nova.network.os_vif_util [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.233 187003 DEBUG nova.network.os_vif_util [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.233 187003 DEBUG os_vif [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.234 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.235 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.235 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.239 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.239 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde7f9d8a-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.240 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapde7f9d8a-48, col_values=(('external_ids', {'iface-id': 'de7f9d8a-48a4-4730-a774-102190ae66ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:8c:fa', 'vm-uuid': 'f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.241 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:22 compute-0 NetworkManager[55458]: <info>  [1763949622.2428] manager: (tapde7f9d8a-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.244 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.254 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.255 187003 INFO os_vif [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48')
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.314 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.314 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.315 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:59:8c:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.315 187003 INFO nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Using config drive
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.821 187003 INFO nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Creating config drive at /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.config
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.826 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0qh3ckll execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:00:22 compute-0 nova_compute[186999]: 2025-11-24 02:00:22.954 187003 DEBUG oslo_concurrency.processutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0qh3ckll" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:00:23 compute-0 kernel: tapde7f9d8a-48: entered promiscuous mode
Nov 24 02:00:23 compute-0 NetworkManager[55458]: <info>  [1763949623.0220] manager: (tapde7f9d8a-48): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.023 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:23 compute-0 ovn_controller[95380]: 2025-11-24T02:00:23Z|00070|binding|INFO|Claiming lport de7f9d8a-48a4-4730-a774-102190ae66ef for this chassis.
Nov 24 02:00:23 compute-0 ovn_controller[95380]: 2025-11-24T02:00:23Z|00071|binding|INFO|de7f9d8a-48a4-4730-a774-102190ae66ef: Claiming fa:16:3e:59:8c:fa 10.100.0.14
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.030 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:8c:fa 10.100.0.14'], port_security=['fa:16:3e:59:8c:fa 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b2607ce-da81-4f0e-9324-8381381e0e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c6a9db73-e658-460b-b661-e16ca8aaba94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84b01f9-1343-4122-aba2-6c3c243d2f99, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=de7f9d8a-48a4-4730-a774-102190ae66ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.033 104238 INFO neutron.agent.ovn.metadata.agent [-] Port de7f9d8a-48a4-4730-a774-102190ae66ef in datapath 5b2607ce-da81-4f0e-9324-8381381e0e39 bound to our chassis
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.034 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b2607ce-da81-4f0e-9324-8381381e0e39
Nov 24 02:00:23 compute-0 ovn_controller[95380]: 2025-11-24T02:00:23Z|00072|binding|INFO|Setting lport de7f9d8a-48a4-4730-a774-102190ae66ef ovn-installed in OVS
Nov 24 02:00:23 compute-0 ovn_controller[95380]: 2025-11-24T02:00:23Z|00073|binding|INFO|Setting lport de7f9d8a-48a4-4730-a774-102190ae66ef up in Southbound
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.039 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:23 compute-0 systemd-udevd[215345]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.061 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1df596e2-c3b8-483f-aa0e-799159ebddbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 systemd-machined[153319]: New machine qemu-5-instance-00000005.
Nov 24 02:00:23 compute-0 NetworkManager[55458]: <info>  [1763949623.0724] device (tapde7f9d8a-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:00:23 compute-0 NetworkManager[55458]: <info>  [1763949623.0734] device (tapde7f9d8a-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:00:23 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.095 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d7e0a0-2d4a-42fe-aa06-094265aa22a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.098 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[7ede406d-bdf6-414c-a3e1-d0b76780aedd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.127 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e413d2ec-f861-4e9f-a84e-24bcf4530e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.146 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6ecd3f-3e13-4a8b-8b76-911cb8bde2f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b2607ce-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:3d:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 305818, 'reachable_time': 26892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215357, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.166 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[940c092f-e03e-46e8-88fd-bc55dd9a6e86]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5b2607ce-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 305830, 'tstamp': 305830}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215359, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5b2607ce-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 305833, 'tstamp': 305833}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215359, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.168 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b2607ce-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.170 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.171 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.172 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b2607ce-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.172 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.173 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b2607ce-d0, col_values=(('external_ids', {'iface-id': '1d1ae020-8e04-4ae9-898c-f460d6d5fb29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:23.173 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.326 187003 DEBUG nova.compute.manager [req-9d446282-a6d2-44a6-ab06-c9912e424dd2 req-5c08f87d-0a8d-45f3-b412-4359fff68d0f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.326 187003 DEBUG oslo_concurrency.lockutils [req-9d446282-a6d2-44a6-ab06-c9912e424dd2 req-5c08f87d-0a8d-45f3-b412-4359fff68d0f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.326 187003 DEBUG oslo_concurrency.lockutils [req-9d446282-a6d2-44a6-ab06-c9912e424dd2 req-5c08f87d-0a8d-45f3-b412-4359fff68d0f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.327 187003 DEBUG oslo_concurrency.lockutils [req-9d446282-a6d2-44a6-ab06-c9912e424dd2 req-5c08f87d-0a8d-45f3-b412-4359fff68d0f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.327 187003 DEBUG nova.compute.manager [req-9d446282-a6d2-44a6-ab06-c9912e424dd2 req-5c08f87d-0a8d-45f3-b412-4359fff68d0f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Processing event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.410 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.411 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949623.410845, f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.412 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] VM Started (Lifecycle Event)
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.415 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.418 187003 INFO nova.virt.libvirt.driver [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Instance spawned successfully.
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.419 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.436 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.442 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.446 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.447 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.447 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.448 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.448 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.449 187003 DEBUG nova.virt.libvirt.driver [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.476 187003 DEBUG nova.network.neutron [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updated VIF entry in instance network info cache for port de7f9d8a-48a4-4730-a774-102190ae66ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.476 187003 DEBUG nova.network.neutron [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updating instance_info_cache with network_info: [{"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.478 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.479 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949623.4109826, f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.479 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] VM Paused (Lifecycle Event)
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.514 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.516 187003 DEBUG oslo_concurrency.lockutils [req-7eef8088-2e0d-4c15-8943-2a8c57fec817 req-83a55ae2-9021-413b-bf98-fe0de39051f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.518 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949623.4148934, f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.519 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] VM Resumed (Lifecycle Event)
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.526 187003 INFO nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Took 4.17 seconds to spawn the instance on the hypervisor.
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.527 187003 DEBUG nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.535 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.539 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.556 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.588 187003 INFO nova.compute.manager [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Took 4.61 seconds to build instance.
Nov 24 02:00:23 compute-0 nova_compute[186999]: 2025-11-24 02:00:23.613 187003 DEBUG oslo_concurrency.lockutils [None req-3bd0d61a-862b-4514-b9ba-89fd63805f06 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.407 187003 DEBUG nova.compute.manager [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.408 187003 DEBUG oslo_concurrency.lockutils [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.409 187003 DEBUG oslo_concurrency.lockutils [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.409 187003 DEBUG oslo_concurrency.lockutils [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.409 187003 DEBUG nova.compute.manager [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] No waiting events found dispatching network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:00:25 compute-0 nova_compute[186999]: 2025-11-24 02:00:25.410 187003 WARNING nova.compute.manager [req-5db77423-4362-4d86-b2d4-66596d0fa4fb req-4c24ab45-18a8-41fe-ac61-16b3bc2a75ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received unexpected event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef for instance with vm_state active and task_state None.
Nov 24 02:00:25 compute-0 podman[215367]: 2025-11-24 02:00:25.830070672 +0000 UTC m=+0.079818773 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 02:00:26 compute-0 nova_compute[186999]: 2025-11-24 02:00:26.115 187003 DEBUG nova.compute.manager [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-changed-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:26 compute-0 nova_compute[186999]: 2025-11-24 02:00:26.117 187003 DEBUG nova.compute.manager [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Refreshing instance network info cache due to event network-changed-de7f9d8a-48a4-4730-a774-102190ae66ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:00:26 compute-0 nova_compute[186999]: 2025-11-24 02:00:26.117 187003 DEBUG oslo_concurrency.lockutils [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:00:26 compute-0 nova_compute[186999]: 2025-11-24 02:00:26.117 187003 DEBUG oslo_concurrency.lockutils [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:00:26 compute-0 nova_compute[186999]: 2025-11-24 02:00:26.118 187003 DEBUG nova.network.neutron [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Refreshing network info cache for port de7f9d8a-48a4-4730-a774-102190ae66ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:00:27 compute-0 nova_compute[186999]: 2025-11-24 02:00:27.081 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:27 compute-0 nova_compute[186999]: 2025-11-24 02:00:27.243 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:27 compute-0 nova_compute[186999]: 2025-11-24 02:00:27.613 187003 DEBUG nova.network.neutron [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updated VIF entry in instance network info cache for port de7f9d8a-48a4-4730-a774-102190ae66ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:00:27 compute-0 nova_compute[186999]: 2025-11-24 02:00:27.614 187003 DEBUG nova.network.neutron [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updating instance_info_cache with network_info: [{"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:27 compute-0 nova_compute[186999]: 2025-11-24 02:00:27.632 187003 DEBUG oslo_concurrency.lockutils [req-781776df-afea-4420-b6ae-0202220582bc req-810506f7-c3b8-4e1d-a6d9-ce02f2c8037c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:00:29 compute-0 sshd-session[215388]: Invalid user sammy from 46.188.119.26 port 36794
Nov 24 02:00:29 compute-0 sshd-session[215388]: Received disconnect from 46.188.119.26 port 36794:11: Bye Bye [preauth]
Nov 24 02:00:29 compute-0 sshd-session[215388]: Disconnected from invalid user sammy 46.188.119.26 port 36794 [preauth]
Nov 24 02:00:31 compute-0 podman[215390]: 2025-11-24 02:00:31.80136161 +0000 UTC m=+0.050285887 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:00:32 compute-0 nova_compute[186999]: 2025-11-24 02:00:32.084 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:32 compute-0 nova_compute[186999]: 2025-11-24 02:00:32.276 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:36 compute-0 ovn_controller[95380]: 2025-11-24T02:00:36Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:8c:fa 10.100.0.14
Nov 24 02:00:36 compute-0 ovn_controller[95380]: 2025-11-24T02:00:36Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:8c:fa 10.100.0.14
Nov 24 02:00:36 compute-0 podman[215428]: 2025-11-24 02:00:36.812894272 +0000 UTC m=+0.054859175 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 02:00:37 compute-0 nova_compute[186999]: 2025-11-24 02:00:37.088 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:37 compute-0 nova_compute[186999]: 2025-11-24 02:00:37.279 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:41 compute-0 podman[215447]: 2025-11-24 02:00:41.843021976 +0000 UTC m=+0.087461707 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 02:00:42 compute-0 nova_compute[186999]: 2025-11-24 02:00:42.133 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:42 compute-0 nova_compute[186999]: 2025-11-24 02:00:42.281 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:42 compute-0 nova_compute[186999]: 2025-11-24 02:00:42.803 187003 INFO nova.compute.manager [None req-911a4648-b97b-4f09-a1d3-49253edfcdd5 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Get console output
Nov 24 02:00:42 compute-0 nova_compute[186999]: 2025-11-24 02:00:42.809 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.120 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.122 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.123 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.123 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.123 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.125 187003 INFO nova.compute.manager [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Terminating instance
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.126 187003 DEBUG nova.compute.manager [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:00:43 compute-0 kernel: tapde7f9d8a-48 (unregistering): left promiscuous mode
Nov 24 02:00:43 compute-0 NetworkManager[55458]: <info>  [1763949643.1597] device (tapde7f9d8a-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:00:43 compute-0 ovn_controller[95380]: 2025-11-24T02:00:43Z|00074|binding|INFO|Releasing lport de7f9d8a-48a4-4730-a774-102190ae66ef from this chassis (sb_readonly=0)
Nov 24 02:00:43 compute-0 ovn_controller[95380]: 2025-11-24T02:00:43Z|00075|binding|INFO|Setting lport de7f9d8a-48a4-4730-a774-102190ae66ef down in Southbound
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.215 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 ovn_controller[95380]: 2025-11-24T02:00:43Z|00076|binding|INFO|Removing iface tapde7f9d8a-48 ovn-installed in OVS
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.217 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.224 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:8c:fa 10.100.0.14'], port_security=['fa:16:3e:59:8c:fa 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b2607ce-da81-4f0e-9324-8381381e0e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c6a9db73-e658-460b-b661-e16ca8aaba94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.206'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84b01f9-1343-4122-aba2-6c3c243d2f99, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=de7f9d8a-48a4-4730-a774-102190ae66ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.226 104238 INFO neutron.agent.ovn.metadata.agent [-] Port de7f9d8a-48a4-4730-a774-102190ae66ef in datapath 5b2607ce-da81-4f0e-9324-8381381e0e39 unbound from our chassis
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.227 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b2607ce-da81-4f0e-9324-8381381e0e39
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.230 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 24 02:00:43 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.634s CPU time.
Nov 24 02:00:43 compute-0 systemd-machined[153319]: Machine qemu-5-instance-00000005 terminated.
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.249 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[99f1bf0c-a413-4fe8-a336-0354da37b229]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 podman[215469]: 2025-11-24 02:00:43.269494087 +0000 UTC m=+0.070951066 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.287 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[67af510c-9a1e-4773-b2b1-fc5c84bc2d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.291 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd1c0fb-85e6-47a4-b0c6-63fdff129126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.327 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[5c63e17d-7b20-4a8b-b399-4be9554dd639]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 podman[215480]: 2025-11-24 02:00:43.336331576 +0000 UTC m=+0.091566112 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.347 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.352 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.354 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1342c0-b4e5-4fef-94fa-bc09704bcefa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b2607ce-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:3d:fd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 305818, 'reachable_time': 26892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215528, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.373 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[55e647d8-685c-4ca5-8ba2-70fbddfa45e4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5b2607ce-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 305830, 'tstamp': 305830}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215535, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5b2607ce-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 305833, 'tstamp': 305833}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215535, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.375 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b2607ce-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.376 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.381 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.382 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b2607ce-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.382 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.382 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b2607ce-d0, col_values=(('external_ids', {'iface-id': '1d1ae020-8e04-4ae9-898c-f460d6d5fb29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:43.382 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.394 187003 INFO nova.virt.libvirt.driver [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Instance destroyed successfully.
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.395 187003 DEBUG nova.objects.instance [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.407 187003 DEBUG nova.virt.libvirt.vif [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:00:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-566499225',display_name='tempest-TestNetworkBasicOps-server-566499225',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-566499225',id=5,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEE3qfHT9nISnvSQTzYdD3SeSBu1w3zJLEgEWYnfIMG+c1PDvXrDOV6ARmzwkZ+NU4vuiwlmceQ1qxiZmo5l1snGqhlYlixBzIu76Ip6Om441/s9c1k67sCxOPk2okfbSw==',key_name='tempest-TestNetworkBasicOps-989980939',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:00:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-fc0tnyu1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:00:23Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.407 187003 DEBUG nova.network.os_vif_util [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "de7f9d8a-48a4-4730-a774-102190ae66ef", "address": "fa:16:3e:59:8c:fa", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde7f9d8a-48", "ovs_interfaceid": "de7f9d8a-48a4-4730-a774-102190ae66ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.408 187003 DEBUG nova.network.os_vif_util [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.408 187003 DEBUG os_vif [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.410 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.410 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde7f9d8a-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.411 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.413 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.415 187003 INFO os_vif [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:8c:fa,bridge_name='br-int',has_traffic_filtering=True,id=de7f9d8a-48a4-4730-a774-102190ae66ef,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde7f9d8a-48')
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.415 187003 INFO nova.virt.libvirt.driver [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Deleting instance files /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403_del
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.416 187003 INFO nova.virt.libvirt.driver [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Deletion of /var/lib/nova/instances/f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403_del complete
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.464 187003 INFO nova.compute.manager [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.465 187003 DEBUG oslo.service.loopingcall [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.465 187003 DEBUG nova.compute.manager [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.465 187003 DEBUG nova.network.neutron [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.882 187003 DEBUG nova.compute.manager [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-unplugged-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.883 187003 DEBUG oslo_concurrency.lockutils [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.883 187003 DEBUG oslo_concurrency.lockutils [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.883 187003 DEBUG oslo_concurrency.lockutils [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.883 187003 DEBUG nova.compute.manager [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] No waiting events found dispatching network-vif-unplugged-de7f9d8a-48a4-4730-a774-102190ae66ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:00:43 compute-0 nova_compute[186999]: 2025-11-24 02:00:43.884 187003 DEBUG nova.compute.manager [req-76257913-df67-4453-9262-b48c1cfbff52 req-6dda2c51-ace6-4cab-8aa4-29bc319a36f6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-unplugged-de7f9d8a-48a4-4730-a774-102190ae66ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.605 187003 DEBUG nova.network.neutron [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.622 187003 INFO nova.compute.manager [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Took 1.16 seconds to deallocate network for instance.
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.657 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.658 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.734 187003 DEBUG nova.compute.provider_tree [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.746 187003 DEBUG nova.scheduler.client.report [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.762 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.802 187003 INFO nova.scheduler.client.report [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403
Nov 24 02:00:44 compute-0 nova_compute[186999]: 2025-11-24 02:00:44.859 187003 DEBUG oslo_concurrency.lockutils [None req-72175f27-e45f-487d-bd81-b574c5619949 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.962 187003 DEBUG nova.compute.manager [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.963 187003 DEBUG oslo_concurrency.lockutils [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.963 187003 DEBUG oslo_concurrency.lockutils [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.963 187003 DEBUG oslo_concurrency.lockutils [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.964 187003 DEBUG nova.compute.manager [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] No waiting events found dispatching network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.964 187003 WARNING nova.compute.manager [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received unexpected event network-vif-plugged-de7f9d8a-48a4-4730-a774-102190ae66ef for instance with vm_state deleted and task_state None.
Nov 24 02:00:45 compute-0 nova_compute[186999]: 2025-11-24 02:00:45.964 187003 DEBUG nova.compute.manager [req-1cd51914-b7c9-4905-a493-22e515701f79 req-cc158af9-e327-4bb7-9ffd-4d4ca67627b9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Received event network-vif-deleted-de7f9d8a-48a4-4730-a774-102190ae66ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.113 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.114 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.114 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.115 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.115 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.116 187003 INFO nova.compute.manager [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Terminating instance
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.117 187003 DEBUG nova.compute.manager [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:00:46 compute-0 kernel: tapcacdcf87-dd (unregistering): left promiscuous mode
Nov 24 02:00:46 compute-0 NetworkManager[55458]: <info>  [1763949646.1403] device (tapcacdcf87-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:00:46 compute-0 ovn_controller[95380]: 2025-11-24T02:00:46Z|00077|binding|INFO|Releasing lport cacdcf87-dd20-4393-8ddb-3e196f1bec94 from this chassis (sb_readonly=0)
Nov 24 02:00:46 compute-0 ovn_controller[95380]: 2025-11-24T02:00:46Z|00078|binding|INFO|Setting lport cacdcf87-dd20-4393-8ddb-3e196f1bec94 down in Southbound
Nov 24 02:00:46 compute-0 ovn_controller[95380]: 2025-11-24T02:00:46Z|00079|binding|INFO|Removing iface tapcacdcf87-dd ovn-installed in OVS
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.147 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.153 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:04:71 10.100.0.10'], port_security=['fa:16:3e:77:04:71 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '80db970f-9856-4c4d-9dff-715dafb6a925', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b2607ce-da81-4f0e-9324-8381381e0e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c169b27d-da28-49aa-b924-137f9275f851', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84b01f9-1343-4122-aba2-6c3c243d2f99, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=cacdcf87-dd20-4393-8ddb-3e196f1bec94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.154 104238 INFO neutron.agent.ovn.metadata.agent [-] Port cacdcf87-dd20-4393-8ddb-3e196f1bec94 in datapath 5b2607ce-da81-4f0e-9324-8381381e0e39 unbound from our chassis
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.155 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b2607ce-da81-4f0e-9324-8381381e0e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.156 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9d32f891-cd20-4f61-8784-7d35d6c684d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.157 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39 namespace which is not needed anymore
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.164 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 24 02:00:46 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 14.250s CPU time.
Nov 24 02:00:46 compute-0 systemd-machined[153319]: Machine qemu-4-instance-00000004 terminated.
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [NOTICE]   (215118) : haproxy version is 2.8.14-c23fe91
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [NOTICE]   (215118) : path to executable is /usr/sbin/haproxy
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [WARNING]  (215118) : Exiting Master process...
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [WARNING]  (215118) : Exiting Master process...
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [ALERT]    (215118) : Current worker (215120) exited with code 143 (Terminated)
Nov 24 02:00:46 compute-0 neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39[215114]: [WARNING]  (215118) : All workers exited. Exiting... (0)
Nov 24 02:00:46 compute-0 systemd[1]: libpod-4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684.scope: Deactivated successfully.
Nov 24 02:00:46 compute-0 podman[215570]: 2025-11-24 02:00:46.302287631 +0000 UTC m=+0.043453147 container died 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 02:00:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684-userdata-shm.mount: Deactivated successfully.
Nov 24 02:00:46 compute-0 NetworkManager[55458]: <info>  [1763949646.3374] manager: (tapcacdcf87-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 24 02:00:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-2d96b22ecfded2be3916f76e925867669ac0016308df7bafae711afeba87fe76-merged.mount: Deactivated successfully.
Nov 24 02:00:46 compute-0 podman[215570]: 2025-11-24 02:00:46.347283399 +0000 UTC m=+0.088448915 container cleanup 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 02:00:46 compute-0 systemd[1]: libpod-conmon-4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684.scope: Deactivated successfully.
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.379 187003 INFO nova.virt.libvirt.driver [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Instance destroyed successfully.
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.380 187003 DEBUG nova.objects.instance [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 80db970f-9856-4c4d-9dff-715dafb6a925 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.390 187003 DEBUG nova.virt.libvirt.vif [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T01:59:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254954222',display_name='tempest-TestNetworkBasicOps-server-1254954222',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254954222',id=4,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDkXTipxsu/cfKZkE2eA0ypez4hb7rz642IdxeUa3rRPrHPGPnY5t1ePwbUwP8GFtOsbs0xgrHWovHTVxfpoOWKLNHCgpC8dIF9IGf7s2bWpbN1Ejn3j98rbyN/+l5kDnA==',key_name='tempest-TestNetworkBasicOps-1290362523',keypairs=<?>,launch_index=0,launched_at=2025-11-24T01:59:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-v1qedma7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T01:59:52Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=80db970f-9856-4c4d-9dff-715dafb6a925,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.391 187003 DEBUG nova.network.os_vif_util [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "address": "fa:16:3e:77:04:71", "network": {"id": "5b2607ce-da81-4f0e-9324-8381381e0e39", "bridge": "br-int", "label": "tempest-network-smoke--1678846235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcacdcf87-dd", "ovs_interfaceid": "cacdcf87-dd20-4393-8ddb-3e196f1bec94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.392 187003 DEBUG nova.network.os_vif_util [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.392 187003 DEBUG os_vif [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.394 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.394 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcacdcf87-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.396 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.399 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.401 187003 INFO os_vif [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:04:71,bridge_name='br-int',has_traffic_filtering=True,id=cacdcf87-dd20-4393-8ddb-3e196f1bec94,network=Network(5b2607ce-da81-4f0e-9324-8381381e0e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcacdcf87-dd')
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.402 187003 INFO nova.virt.libvirt.driver [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Deleting instance files /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925_del
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.403 187003 INFO nova.virt.libvirt.driver [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Deletion of /var/lib/nova/instances/80db970f-9856-4c4d-9dff-715dafb6a925_del complete
Nov 24 02:00:46 compute-0 podman[215608]: 2025-11-24 02:00:46.430146547 +0000 UTC m=+0.057294293 container remove 4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.438 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[dd54442a-b62c-4777-a141-6ec54efd4e97]: (4, ('Mon Nov 24 02:00:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39 (4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684)\n4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684\nMon Nov 24 02:00:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39 (4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684)\n4dc816878f8e3566c2a45a918f47d8b8d7ade3f81409b6a231535ff4ae86a684\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.439 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e53a47ba-df90-4c44-b9a0-3a05c2696c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.441 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b2607ce-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:46 compute-0 kernel: tap5b2607ce-d0: left promiscuous mode
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.445 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.449 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6073cade-2135-4544-9c72-19fe3e9f83f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.452 187003 INFO nova.compute.manager [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.453 187003 DEBUG oslo.service.loopingcall [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.453 187003 DEBUG nova.compute.manager [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.454 187003 DEBUG nova.network.neutron [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.468 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.481 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[5d10d216-31a5-4265-8185-49a40575c37c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.483 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[294706cc-b5c7-445a-b3cc-cf719d3cba52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.499 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[51740ba8-c96e-42e1-b823-74aa84e3484f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 305811, 'reachable_time': 20499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215630, 'error': None, 'target': 'ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.501 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5b2607ce-da81-4f0e-9324-8381381e0e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:00:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:46.502 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[25eaf987-5f12-4348-99a1-19a4f858ccd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:00:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d5b2607ce\x2dda81\x2d4f0e\x2d9324\x2d8381381e0e39.mount: Deactivated successfully.
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.924 187003 DEBUG nova.network.neutron [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:00:46 compute-0 nova_compute[186999]: 2025-11-24 02:00:46.974 187003 INFO nova.compute.manager [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Took 0.52 seconds to deallocate network for instance.
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.027 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.028 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.080 187003 DEBUG nova.compute.provider_tree [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.095 187003 DEBUG nova.scheduler.client.report [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.120 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.136 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.146 187003 INFO nova.scheduler.client.report [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 80db970f-9856-4c4d-9dff-715dafb6a925
Nov 24 02:00:47 compute-0 nova_compute[186999]: 2025-11-24 02:00:47.198 187003 DEBUG oslo_concurrency.lockutils [None req-a3d857af-42d8-48a4-a763-93564b8dcf21 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.041 187003 DEBUG nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-vif-unplugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.042 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.042 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.042 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.043 187003 DEBUG nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] No waiting events found dispatching network-vif-unplugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.043 187003 WARNING nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received unexpected event network-vif-unplugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 for instance with vm_state deleted and task_state None.
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.043 187003 DEBUG nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.043 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.043 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.044 187003 DEBUG oslo_concurrency.lockutils [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "80db970f-9856-4c4d-9dff-715dafb6a925-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.044 187003 DEBUG nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] No waiting events found dispatching network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.044 187003 WARNING nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received unexpected event network-vif-plugged-cacdcf87-dd20-4393-8ddb-3e196f1bec94 for instance with vm_state deleted and task_state None.
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.044 187003 DEBUG nova.compute.manager [req-9ab576cb-d85c-4a10-b6b5-3d6b60aa466f req-a9b3dc6a-474b-47a7-8a4f-cc0e82c8ed95 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Received event network-vif-deleted-cacdcf87-dd20-4393-8ddb-3e196f1bec94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:00:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:48.421 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:00:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:48.422 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:00:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:48.422 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:00:48 compute-0 nova_compute[186999]: 2025-11-24 02:00:48.722 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:48.723 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:00:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:48.724 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:00:51 compute-0 nova_compute[186999]: 2025-11-24 02:00:51.400 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:52 compute-0 nova_compute[186999]: 2025-11-24 02:00:52.175 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:52 compute-0 podman[215631]: 2025-11-24 02:00:52.838458181 +0000 UTC m=+0.084121584 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:00:53 compute-0 nova_compute[186999]: 2025-11-24 02:00:53.730 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:53 compute-0 nova_compute[186999]: 2025-11-24 02:00:53.837 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:56 compute-0 nova_compute[186999]: 2025-11-24 02:00:56.404 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:56 compute-0 podman[215652]: 2025-11-24 02:00:56.808270337 +0000 UTC m=+0.052636608 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 24 02:00:57 compute-0 nova_compute[186999]: 2025-11-24 02:00:57.178 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:00:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:00:57.726 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:00:58 compute-0 nova_compute[186999]: 2025-11-24 02:00:58.394 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949643.392527, f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:00:58 compute-0 nova_compute[186999]: 2025-11-24 02:00:58.394 187003 INFO nova.compute.manager [-] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] VM Stopped (Lifecycle Event)
Nov 24 02:00:58 compute-0 nova_compute[186999]: 2025-11-24 02:00:58.416 187003 DEBUG nova.compute.manager [None req-7f884bb3-8415-4aa4-8a5b-a8c762c022fc - - - - - -] [instance: f77f8ac5-c08e-4b1a-8cae-f4ea5c20f403] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:01 compute-0 nova_compute[186999]: 2025-11-24 02:01:01.377 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949646.375965, 80db970f-9856-4c4d-9dff-715dafb6a925 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:01 compute-0 nova_compute[186999]: 2025-11-24 02:01:01.377 187003 INFO nova.compute.manager [-] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] VM Stopped (Lifecycle Event)
Nov 24 02:01:01 compute-0 nova_compute[186999]: 2025-11-24 02:01:01.394 187003 DEBUG nova.compute.manager [None req-ac4cce70-137a-4c8d-a0aa-62e1051dd354 - - - - - -] [instance: 80db970f-9856-4c4d-9dff-715dafb6a925] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:01 compute-0 nova_compute[186999]: 2025-11-24 02:01:01.409 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:01 compute-0 CROND[215675]: (root) CMD (run-parts /etc/cron.hourly)
Nov 24 02:01:01 compute-0 run-parts[215678]: (/etc/cron.hourly) starting 0anacron
Nov 24 02:01:02 compute-0 anacron[215686]: Anacron started on 2025-11-24
Nov 24 02:01:02 compute-0 anacron[215686]: Will run job `cron.daily' in 13 min.
Nov 24 02:01:02 compute-0 anacron[215686]: Will run job `cron.weekly' in 33 min.
Nov 24 02:01:02 compute-0 anacron[215686]: Will run job `cron.monthly' in 53 min.
Nov 24 02:01:02 compute-0 anacron[215686]: Jobs will be executed sequentially
Nov 24 02:01:02 compute-0 run-parts[215688]: (/etc/cron.hourly) finished 0anacron
Nov 24 02:01:02 compute-0 CROND[215674]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 24 02:01:02 compute-0 nova_compute[186999]: 2025-11-24 02:01:02.180 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:02 compute-0 podman[215689]: 2025-11-24 02:01:02.829505031 +0000 UTC m=+0.069223658 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.202 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.203 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.216 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.287 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.287 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.297 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.298 187003 INFO nova.compute.claims [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.410 187003 DEBUG nova.compute.provider_tree [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.412 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.423 187003 DEBUG nova.scheduler.client.report [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.444 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.445 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.488 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.489 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.504 187003 INFO nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.517 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.601 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.603 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.603 187003 INFO nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Creating image(s)
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.604 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.604 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.605 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.622 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.666 187003 DEBUG nova.policy [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.690 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.691 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.692 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.702 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.767 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.769 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.796 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.812 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.813 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.813 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.901 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.903 187003 DEBUG nova.virt.disk.api [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.903 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.970 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.971 187003 DEBUG nova.virt.disk.api [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.971 187003 DEBUG nova.objects.instance [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.986 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.987 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Ensure instance console log exists: /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.988 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.988 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:06 compute-0 nova_compute[186999]: 2025-11-24 02:01:06.988 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:07 compute-0 nova_compute[186999]: 2025-11-24 02:01:07.183 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:07 compute-0 nova_compute[186999]: 2025-11-24 02:01:07.590 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Successfully created port: 175bb896-4ccd-40b1-8746-160b190ce3fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:01:07 compute-0 podman[215729]: 2025-11-24 02:01:07.79268508 +0000 UTC m=+0.048243025 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.835 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Successfully updated port: 175bb896-4ccd-40b1-8746-160b190ce3fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.846 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.847 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.847 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.908 187003 DEBUG nova.compute.manager [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.909 187003 DEBUG nova.compute.manager [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing instance network info cache due to event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.909 187003 DEBUG oslo_concurrency.lockutils [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:08 compute-0 nova_compute[186999]: 2025-11-24 02:01:08.957 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.783 187003 DEBUG nova.network.neutron [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.810 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.811 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance network_info: |[{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.813 187003 DEBUG oslo_concurrency.lockutils [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.813 187003 DEBUG nova.network.neutron [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.820 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Start _get_guest_xml network_info=[{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.827 187003 WARNING nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.835 187003 DEBUG nova.virt.libvirt.host [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.836 187003 DEBUG nova.virt.libvirt.host [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.839 187003 DEBUG nova.virt.libvirt.host [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.840 187003 DEBUG nova.virt.libvirt.host [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.841 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.841 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.842 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.842 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.843 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.843 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.843 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.844 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.844 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.844 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.845 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.845 187003 DEBUG nova.virt.hardware [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.851 187003 DEBUG nova.virt.libvirt.vif [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:01:06Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.852 187003 DEBUG nova.network.os_vif_util [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.853 187003 DEBUG nova.network.os_vif_util [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.854 187003 DEBUG nova.objects.instance [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.867 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <uuid>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</uuid>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <name>instance-00000006</name>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:01:09</nova:creationTime>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:01:09 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <system>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="serial">a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="uuid">a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </system>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <os>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </os>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <features>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </features>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:95:60:76"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <target dev="tap175bb896-4c"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log" append="off"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <video>
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </video>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:01:09 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:01:09 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:01:09 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:01:09 compute-0 nova_compute[186999]: </domain>
Nov 24 02:01:09 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.869 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Preparing to wait for external event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.870 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.870 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.870 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.871 187003 DEBUG nova.virt.libvirt.vif [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:01:06Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.872 187003 DEBUG nova.network.os_vif_util [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.872 187003 DEBUG nova.network.os_vif_util [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.873 187003 DEBUG os_vif [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.874 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.874 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.874 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.878 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.878 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap175bb896-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.879 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap175bb896-4c, col_values=(('external_ids', {'iface-id': '175bb896-4ccd-40b1-8746-160b190ce3fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:60:76', 'vm-uuid': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.881 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.883 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:01:09 compute-0 NetworkManager[55458]: <info>  [1763949669.8833] manager: (tap175bb896-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.890 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.891 187003 INFO os_vif [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c')
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.940 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.941 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.942 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:95:60:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:01:09 compute-0 nova_compute[186999]: 2025-11-24 02:01:09.942 187003 INFO nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Using config drive
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.339 187003 INFO nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Creating config drive at /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.344 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7ir8sxw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.471 187003 DEBUG oslo_concurrency.processutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7ir8sxw" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:10 compute-0 kernel: tap175bb896-4c: entered promiscuous mode
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.5434] manager: (tap175bb896-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 24 02:01:10 compute-0 ovn_controller[95380]: 2025-11-24T02:01:10Z|00080|binding|INFO|Claiming lport 175bb896-4ccd-40b1-8746-160b190ce3fc for this chassis.
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.543 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 ovn_controller[95380]: 2025-11-24T02:01:10Z|00081|binding|INFO|175bb896-4ccd-40b1-8746-160b190ce3fc: Claiming fa:16:3e:95:60:76 10.100.0.11
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.549 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.553 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.567 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:60:76 10.100.0.11'], port_security=['fa:16:3e:95:60:76 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3a228f-1352-43c0-b602-704afca624c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f1d665d3-744d-426a-8fc5-4bea51a25946', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97327433-796a-4849-8d2d-30cf53b4e27b, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=175bb896-4ccd-40b1-8746-160b190ce3fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.568 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 175bb896-4ccd-40b1-8746-160b190ce3fc in datapath cc3a228f-1352-43c0-b602-704afca624c0 bound to our chassis
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.569 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cc3a228f-1352-43c0-b602-704afca624c0
Nov 24 02:01:10 compute-0 systemd-udevd[215768]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.583 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c827fc92-ee06-420d-8676-7f4d271b175b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.585 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcc3a228f-11 in ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.586 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcc3a228f-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.586 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[66db7756-f44d-49fc-a8bf-239fbf37c309]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.589 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[57feed1a-5208-49ef-a803-a154b3f029fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.5982] device (tap175bb896-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.5990] device (tap175bb896-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:01:10 compute-0 systemd-machined[153319]: New machine qemu-6-instance-00000006.
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.604 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[1b20ecea-182f-4f0f-b5af-44e745afd496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.610 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 24 02:01:10 compute-0 ovn_controller[95380]: 2025-11-24T02:01:10Z|00082|binding|INFO|Setting lport 175bb896-4ccd-40b1-8746-160b190ce3fc ovn-installed in OVS
Nov 24 02:01:10 compute-0 ovn_controller[95380]: 2025-11-24T02:01:10Z|00083|binding|INFO|Setting lport 175bb896-4ccd-40b1-8746-160b190ce3fc up in Southbound
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.614 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.634 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f27f077b-741d-4670-917b-8c3a7a68461a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.670 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[7caa9b5f-a39f-4d58-a4e4-2aa4c7cdd7cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.676 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a59900ae-d501-4e57-b84a-4ff84a89df86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.6773] manager: (tapcc3a228f-10): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.709 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[d1670f70-12e6-4b31-8412-becec56fd69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.712 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[88a11ba6-76ad-430a-a6b4-f70b4bc534c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.7352] device (tapcc3a228f-10): carrier: link connected
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.742 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[28beee57-2b6f-4de6-a929-672d05fbee33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.758 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb83e95-0824-4eef-a60c-f22968a33914]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3a228f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:42:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313739, 'reachable_time': 40838, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215801, 'error': None, 'target': 'ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.783 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[195a8fbc-c2fb-4101-ab2a-a3a8591b10a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:4291'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 313739, 'tstamp': 313739}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215802, 'error': None, 'target': 'ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.793 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.794 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.795 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.795 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.810 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[237cffe9-7e31-4bbd-8d14-52573b2f30cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcc3a228f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:42:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313739, 'reachable_time': 40838, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215803, 'error': None, 'target': 'ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.852 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[68d5172a-498a-4a8c-bf2f-56996686c838]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.864 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.932 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.933 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.938 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[659577f5-a7e3-47d5-93b2-6561c37a4cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.940 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3a228f-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.941 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.942 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc3a228f-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:10 compute-0 NetworkManager[55458]: <info>  [1763949670.9451] manager: (tapcc3a228f-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 24 02:01:10 compute-0 kernel: tapcc3a228f-10: entered promiscuous mode
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.948 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcc3a228f-10, col_values=(('external_ids', {'iface-id': '3fd96984-65bc-4f5c-892b-b6485ade3b7a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:10 compute-0 ovn_controller[95380]: 2025-11-24T02:01:10Z|00084|binding|INFO|Releasing lport 3fd96984-65bc-4f5c-892b-b6485ade3b7a from this chassis (sb_readonly=0)
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.950 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cc3a228f-1352-43c0-b602-704afca624c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cc3a228f-1352-43c0-b602-704afca624c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.951 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[272cd3bb-a6f5-43b9-a905-a12edf9c39e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.952 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-cc3a228f-1352-43c0-b602-704afca624c0
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/cc3a228f-1352-43c0-b602-704afca624c0.pid.haproxy
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID cc3a228f-1352-43c0-b602-704afca624c0
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:01:10 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:10.953 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0', 'env', 'PROCESS_TAG=haproxy-cc3a228f-1352-43c0-b602-704afca624c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cc3a228f-1352-43c0-b602-704afca624c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.956 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.989 187003 DEBUG nova.compute.manager [req-26dcf8af-b82e-4ba6-a7c6-3b08b96a5569 req-b6aa2a34-a6d6-43cb-ae53-799e4d2febaa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.990 187003 DEBUG oslo_concurrency.lockutils [req-26dcf8af-b82e-4ba6-a7c6-3b08b96a5569 req-b6aa2a34-a6d6-43cb-ae53-799e4d2febaa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.991 187003 DEBUG oslo_concurrency.lockutils [req-26dcf8af-b82e-4ba6-a7c6-3b08b96a5569 req-b6aa2a34-a6d6-43cb-ae53-799e4d2febaa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.991 187003 DEBUG oslo_concurrency.lockutils [req-26dcf8af-b82e-4ba6-a7c6-3b08b96a5569 req-b6aa2a34-a6d6-43cb-ae53-799e4d2febaa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:10 compute-0 nova_compute[186999]: 2025-11-24 02:01:10.991 187003 DEBUG nova.compute.manager [req-26dcf8af-b82e-4ba6-a7c6-3b08b96a5569 req-b6aa2a34-a6d6-43cb-ae53-799e4d2febaa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Processing event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.006 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.037 187003 DEBUG nova.network.neutron [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated VIF entry in instance network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.038 187003 DEBUG nova.network.neutron [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.052 187003 DEBUG oslo_concurrency.lockutils [req-41b79567-6188-4469-b02e-8dec6c320590 req-6ccc56c3-20b5-425f-b3e7-53577e2f0626 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.144 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949671.1435668, a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.145 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] VM Started (Lifecycle Event)
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.149 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.154 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.160 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.162 187003 INFO nova.virt.libvirt.driver [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance spawned successfully.
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.163 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.165 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.183 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.184 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949671.1440046, a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.185 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] VM Paused (Lifecycle Event)
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.191 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.192 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.193 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.193 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.194 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.194 187003 DEBUG nova.virt.libvirt.driver [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.219 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.223 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949671.1540134, a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.223 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] VM Resumed (Lifecycle Event)
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.248 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.250 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5765MB free_disk=73.4593734741211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.250 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.250 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.251 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'name': 'tempest-TestNetworkBasicOps-server-451741380', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'hostId': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.252 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>]
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.253 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.256 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 / tap175bb896-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.257 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36c1c057-4cc5-4b56-bd1a-8a42056b00a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.253504', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '73131f2a-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': 'd4533d1a79c1d666379f2104b1cd5ef390888a14d8ac6e622521b8762bfd4e1b'}]}, 'timestamp': '2025-11-24 02:01:11.258001', '_unique_id': '2f59698bec024584bb1c8ed1e1cfaee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.259 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.260 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.260 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6504dd5c-16c0-4a98-ae5e-375c43277f64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.260500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731393ce-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '0dfcb69bd9a3df5f6dd26c42effc8500baef3b85d44f0012ae0281968d4efa3f'}]}, 'timestamp': '2025-11-24 02:01:11.260833', '_unique_id': 'a607b38b85d94f3f942900c8855d99c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.273 187003 INFO nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Took 4.67 seconds to spawn the instance on the hypervisor.
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.273 187003 DEBUG nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.288 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.289 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48e8b8c2-030f-4556-9545-470358f9ed08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.262407', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7317ed20-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '094281ff4bea5aef689838dd51db732ab2f7b4a757ad86edae26fb49a67af1a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.262407', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7317f9aa-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': 'e535f4965c3ac3eda8d2b86d84251cb1366fdcb49a50b13e8568cf9e7a8b551f'}]}, 'timestamp': '2025-11-24 02:01:11.289630', '_unique_id': '399d07c5dce64bfbaabf333e03525d77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.291 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.291 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38fd422e-c4e7-49fd-a337-027c5f152b14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.291739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '7318581e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '029db2aa002fb85751bbe6ab12fc49ddfa4e6194fc13975c39d2d6861a675812'}]}, 'timestamp': '2025-11-24 02:01:11.292020', '_unique_id': 'dd3e422e50ac49228ba1eaa7375da08f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.293 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.293 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.293 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3bba333-5142-4f89-9bdd-d76466ef9117', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.293268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7318932e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '335e36a4f2cf8d431c3fc5f3320c16a8820376052259d8cf837ffe33c9e37a82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.293268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '73189d10-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': 'fb9b5bd689e3d90e9a87fee3f7f5507ea630baba3af2f5d75a0f838c5e739ee5'}]}, 'timestamp': '2025-11-24 02:01:11.293800', '_unique_id': '19d70fc9e2d04f638ee1f3e5080906c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.302 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.304 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.304 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c18a7cfb-0ec2-4153-8815-f5eb5d521d3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.294968', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '731a4d72-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': 'bf4e5282652b418dc987d98be9f0011d3e90f8696e34ccf475e58bf6530ed1bf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.294968', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '731a5696-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': 'c8d375d2caa8945316346a332a0af06dea88beba7bb04454f5e830d13dfc2be8'}]}, 'timestamp': '2025-11-24 02:01:11.305072', '_unique_id': 'e4e92b11309b4e77bdabda0563896baa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.306 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.306 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd66d822e-046f-4c96-be1d-3dd26e0401b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.306378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '731a937c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': 'a89ace3e738f98cb27964ec43d4706e93391b57760da8b86b889db427be82ace'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.306378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '731a9dc2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': 'd62bfd6c5e33934da416bbf2399556c42a8eb685ea5ab25386a26126507a91f7'}]}, 'timestamp': '2025-11-24 02:01:11.306918', '_unique_id': '41ade478956f413ea6bdd436a18861df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.308 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.308 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>]
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.308 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0eadde-cb29-4887-ba36-cdd85b33ecf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.308391', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731ae0de-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': 'a8b37f50c654527572545a39ebcec03cd54f27f7dea72c5759db4f4a95292c75'}]}, 'timestamp': '2025-11-24 02:01:11.308625', '_unique_id': '3658e95ef21945849f84554d984242f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.309 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.327 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.327 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1: ceilometer.compute.pollsters.NoVolumeException
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.327 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.327 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beaf550b-71a7-4eb6-a91f-f1df8b350980', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.327725', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731dda6e-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '87584ad142719668d58ce8b41a645ecd46dff797e04984ce9aa71b4c17737016'}]}, 'timestamp': '2025-11-24 02:01:11.328269', '_unique_id': 'e1e759b243f240489b66858e28b2f49b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.329 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.330 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.330 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>]
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.331 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a4011e2-87d7-448b-9f1e-998a0a79348d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.331050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731e587c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '3e435a21ecb2b978ac61ec06d5a12ba6d636104f3bf2ed89f5d58301a1de54bf'}]}, 'timestamp': '2025-11-24 02:01:11.331426', '_unique_id': 'a4db8c2a63324d2e9d9f7c2a8eb281fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.332 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.333 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.333 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.333 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '554e4255-ff47-43a9-a62a-bf9a49b8a82b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.333183', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '731eaaa2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': 'f7e409ab6adbb80eedb181174e4b21e8ad5187d7299ff93c911b8fa812b0a2dc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.333183', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '731ec550-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': 'd0628eb7533bc7bad65760ee3430ab17135018904c6f8fa6d2c7f2f51b87ec67'}]}, 'timestamp': '2025-11-24 02:01:11.334172', '_unique_id': '58b4e20815f045a3a94919905fbd9737'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86fc3123-adb5-4a2c-aa7b-052427888ad3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.336036', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731f1b2c-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '80868cb3feebbda087482445db44f6161a9e7d99fb97f93e7067335a001cf74f'}]}, 'timestamp': '2025-11-24 02:01:11.336364', '_unique_id': '1f93f71f6a4a48b89f1822aedd9fcc65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.336 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.337 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.337 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6198b009-545d-42d9-a645-fc5712db9fba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.337498', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '731f51d2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '1cc03cdaf3bd8589737d58967a817195647201ab8d8104748140ffa34170c946'}]}, 'timestamp': '2025-11-24 02:01:11.337730', '_unique_id': '96051cd1b07e48b3a7f858a2f689c97f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.338 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/cpu volume: 130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9f00039-c1c0-44a2-9eb3-afe26ca9403d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 130000000, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'timestamp': '2025-11-24T02:01:11.338851', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '731f87d8-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.043229216, 'message_signature': 'b07bc39d41e66fc0b00156dbde7e23d211f20a4bfbf5741cb38b4e5a5be51ac6'}]}, 'timestamp': '2025-11-24 02:01:11.339106', '_unique_id': '69cdf49b579a492fb4399de447ac7ecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.340 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.340 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e88cc99b-6777-47da-8126-7ee1e77f5828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.340203', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '731fbb72-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '33ad5d679b6c39dd182cb6cd27699e0315d45f8ac854c2550106c2979fe08194'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.340203', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '731fc324-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '009f8d8e074bb8c36b90c817915367d92b20733c259a6fdc55b1d934f5219b3a'}]}, 'timestamp': '2025-11-24 02:01:11.340617', '_unique_id': '07a6faf8f7394d60a0d913a905739c64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.341 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '732680fd-df83-45ce-ba6d-8a9b68eb34d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.341804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '731ffaa6-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '145510a061223f4b2bd19779e9707bedc2bc92697f14560f760635724823d4fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.341804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '732004ec-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': 'd9276cfcc074847a71bdd957918ef281d3719e6d66a60c9d6e5474ba54c68274'}]}, 'timestamp': '2025-11-24 02:01:11.342333', '_unique_id': 'e4a0a455ae184131b1873287e608ebda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.342 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.343 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.343 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5624674-d29f-48cd-8932-10fbbf5d2ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.343456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73203a52-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '8e227705f702c08bd5798219b17ce94fbad43e82c7a465ea13ffd7cb63b0232d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.343456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '732041e6-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.97860921, 'message_signature': '11158b1fd5059145e423a7ead1575f93d9e94189c959d25235f42c4c8cdf9f09'}]}, 'timestamp': '2025-11-24 02:01:11.343914', '_unique_id': '10c4f5cf6edd4c289a463d4ef1d53b03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.345 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.345 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8348abf-657b-4a79-bd82-29e994e60c45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-vda', 'timestamp': '2025-11-24T02:01:11.345195', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '73207fda-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': 'c46a11db2c45565feb7b085c8bd811bf8dd989e5be4dbf935938fe81be00f854'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-sda', 'timestamp': '2025-11-24T02:01:11.345195', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'instance-00000006', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '732088fe-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3138.01116919, 'message_signature': '7ff4a9c566502dde3976a9f63a79c8ac2e305e22a5417793e414bd202a27274f'}]}, 'timestamp': '2025-11-24 02:01:11.345680', '_unique_id': 'b7f6eb8c9a494023b061840f24e31b7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.346 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a9979ca-2d31-4b09-91cb-aefe7f6bbff8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.346866', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '7320c120-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '3d3800e861fd9e19d3bdac108fd3ca1a1a5137c61b07e24155cd5cf0275aeb47'}]}, 'timestamp': '2025-11-24 02:01:11.347170', '_unique_id': 'f855319a6f5e444c94f08676cfa0a64b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.348 12 DEBUG ceilometer.compute.pollsters [-] a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4dd12b0-2c14-425c-b474-750580ab7a8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-00000006-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-tap175bb896-4c', 'timestamp': '2025-11-24T02:01:11.348663', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-451741380', 'name': 'tap175bb896-4c', 'instance_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:60:76', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap175bb896-4c'}, 'message_id': '732105c2-c8d9-11f0-959b-fa163eb968c1', 'monotonic_time': 3137.969672697, 'message_signature': '1dea970afd315b605ec5beb440b3b424d9c30c0ad60706767e533d6f55363ab2'}]}, 'timestamp': '2025-11-24 02:01:11.348906', '_unique_id': 'b9ad9adca10848219a85a2c1f7f4643b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.349 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.350 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.350 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:01:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:01:11.350 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-451741380>]
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.351 187003 INFO nova.compute.manager [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Took 5.10 seconds to build instance.
Nov 24 02:01:11 compute-0 podman[215847]: 2025-11-24 02:01:11.353952745 +0000 UTC m=+0.054020898 container create 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.354 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.354 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.354 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.365 187003 DEBUG oslo_concurrency.lockutils [None req-5d1175db-2150-42c7-9227-0dba22a97420 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:11 compute-0 systemd[1]: Started libpod-conmon-5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33.scope.
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.410 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:01:11 compute-0 podman[215847]: 2025-11-24 02:01:11.325380277 +0000 UTC m=+0.025448460 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.425 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:01:11 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:01:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d61bf364c4657eb6f481207e93ddc77354eaeff1d7df758528e6d43a1dbfcbb4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:01:11 compute-0 podman[215847]: 2025-11-24 02:01:11.457549073 +0000 UTC m=+0.157617246 container init 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.457 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:01:11 compute-0 nova_compute[186999]: 2025-11-24 02:01:11.458 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:11 compute-0 podman[215847]: 2025-11-24 02:01:11.463226563 +0000 UTC m=+0.163294706 container start 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 02:01:11 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [NOTICE]   (215863) : New worker (215865) forked
Nov 24 02:01:11 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [NOTICE]   (215863) : Loading success.
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.187 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:12 compute-0 sshd-session[215806]: Invalid user bot from 154.90.59.75 port 48270
Nov 24 02:01:12 compute-0 podman[215874]: 2025-11-24 02:01:12.28851585 +0000 UTC m=+0.078238973 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 24 02:01:12 compute-0 sshd-session[215806]: Received disconnect from 154.90.59.75 port 48270:11: Bye Bye [preauth]
Nov 24 02:01:12 compute-0 sshd-session[215806]: Disconnected from invalid user bot 154.90.59.75 port 48270 [preauth]
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.454 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.454 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.454 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.454 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.617 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.617 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.617 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 02:01:12 compute-0 nova_compute[186999]: 2025-11-24 02:01:12.617 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.063 187003 DEBUG nova.compute.manager [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.063 187003 DEBUG oslo_concurrency.lockutils [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.064 187003 DEBUG oslo_concurrency.lockutils [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.064 187003 DEBUG oslo_concurrency.lockutils [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.064 187003 DEBUG nova.compute.manager [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.064 187003 WARNING nova.compute.manager [req-4c9817b0-8b45-4e20-8fd1-3dbd39eb1f93 req-ef69af56-cdcb-41d8-a377-0f1a3265ab80 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc for instance with vm_state active and task_state None.
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.598 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.626 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.626 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.627 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:13 compute-0 nova_compute[186999]: 2025-11-24 02:01:13.627 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:13 compute-0 podman[215895]: 2025-11-24 02:01:13.821788936 +0000 UTC m=+0.075257178 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:01:13 compute-0 podman[215896]: 2025-11-24 02:01:13.832022015 +0000 UTC m=+0.080775984 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 02:01:14 compute-0 ovn_controller[95380]: 2025-11-24T02:01:14Z|00085|binding|INFO|Releasing lport 3fd96984-65bc-4f5c-892b-b6485ade3b7a from this chassis (sb_readonly=0)
Nov 24 02:01:14 compute-0 NetworkManager[55458]: <info>  [1763949674.3034] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 24 02:01:14 compute-0 NetworkManager[55458]: <info>  [1763949674.3041] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 24 02:01:14 compute-0 nova_compute[186999]: 2025-11-24 02:01:14.304 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:14 compute-0 ovn_controller[95380]: 2025-11-24T02:01:14Z|00086|binding|INFO|Releasing lport 3fd96984-65bc-4f5c-892b-b6485ade3b7a from this chassis (sb_readonly=0)
Nov 24 02:01:14 compute-0 nova_compute[186999]: 2025-11-24 02:01:14.316 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:14 compute-0 nova_compute[186999]: 2025-11-24 02:01:14.320 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:14 compute-0 nova_compute[186999]: 2025-11-24 02:01:14.881 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:15 compute-0 nova_compute[186999]: 2025-11-24 02:01:15.192 187003 DEBUG nova.compute.manager [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:15 compute-0 nova_compute[186999]: 2025-11-24 02:01:15.193 187003 DEBUG nova.compute.manager [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing instance network info cache due to event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:01:15 compute-0 nova_compute[186999]: 2025-11-24 02:01:15.193 187003 DEBUG oslo_concurrency.lockutils [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:15 compute-0 nova_compute[186999]: 2025-11-24 02:01:15.194 187003 DEBUG oslo_concurrency.lockutils [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:15 compute-0 nova_compute[186999]: 2025-11-24 02:01:15.194 187003 DEBUG nova.network.neutron [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:01:16 compute-0 nova_compute[186999]: 2025-11-24 02:01:16.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:01:16 compute-0 nova_compute[186999]: 2025-11-24 02:01:16.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:01:17 compute-0 nova_compute[186999]: 2025-11-24 02:01:17.189 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:17 compute-0 nova_compute[186999]: 2025-11-24 02:01:17.848 187003 DEBUG nova.network.neutron [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated VIF entry in instance network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:01:17 compute-0 nova_compute[186999]: 2025-11-24 02:01:17.849 187003 DEBUG nova.network.neutron [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:17 compute-0 nova_compute[186999]: 2025-11-24 02:01:17.868 187003 DEBUG oslo_concurrency.lockutils [req-e249b0e6-ba4b-4d69-8fa2-83fe6559f6e7 req-f4bcbe5c-9078-425d-98a6-3c02027ababb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:19 compute-0 nova_compute[186999]: 2025-11-24 02:01:19.883 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:22 compute-0 nova_compute[186999]: 2025-11-24 02:01:22.189 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:23 compute-0 podman[215965]: 2025-11-24 02:01:23.818195203 +0000 UTC m=+0.057338571 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 24 02:01:24 compute-0 ovn_controller[95380]: 2025-11-24T02:01:24Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:60:76 10.100.0.11
Nov 24 02:01:24 compute-0 ovn_controller[95380]: 2025-11-24T02:01:24Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:60:76 10.100.0.11
Nov 24 02:01:24 compute-0 nova_compute[186999]: 2025-11-24 02:01:24.886 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:27 compute-0 nova_compute[186999]: 2025-11-24 02:01:27.192 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:27 compute-0 podman[215985]: 2025-11-24 02:01:27.835713754 +0000 UTC m=+0.081431962 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 24 02:01:29 compute-0 nova_compute[186999]: 2025-11-24 02:01:29.890 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:31 compute-0 nova_compute[186999]: 2025-11-24 02:01:31.579 187003 INFO nova.compute.manager [None req-5048075a-12b0-4115-964d-76ea9d20c06c e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Get console output
Nov 24 02:01:31 compute-0 nova_compute[186999]: 2025-11-24 02:01:31.588 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:01:32 compute-0 nova_compute[186999]: 2025-11-24 02:01:32.194 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:33 compute-0 podman[216006]: 2025-11-24 02:01:33.821200047 +0000 UTC m=+0.074629970 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.246 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.247 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.247 187003 DEBUG nova.objects.instance [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'flavor' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.627 187003 DEBUG nova.objects.instance [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_requests' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.641 187003 DEBUG nova.network.neutron [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.769 187003 DEBUG nova.policy [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:01:34 compute-0 nova_compute[186999]: 2025-11-24 02:01:34.892 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.176 187003 DEBUG nova.network.neutron [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Successfully created port: cb732a8a-275c-4d2f-8753-b13117c9e15b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.868 187003 DEBUG nova.network.neutron [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Successfully updated port: cb732a8a-275c-4d2f-8753-b13117c9e15b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.915 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.916 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.916 187003 DEBUG nova.network.neutron [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.945 187003 DEBUG nova.compute.manager [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-changed-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.946 187003 DEBUG nova.compute.manager [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing instance network info cache due to event network-changed-cb732a8a-275c-4d2f-8753-b13117c9e15b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:01:35 compute-0 nova_compute[186999]: 2025-11-24 02:01:35.946 187003 DEBUG oslo_concurrency.lockutils [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:37 compute-0 nova_compute[186999]: 2025-11-24 02:01:37.197 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:38 compute-0 podman[216030]: 2025-11-24 02:01:38.833025521 +0000 UTC m=+0.081033891 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:01:39 compute-0 nova_compute[186999]: 2025-11-24 02:01:39.893 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.626 187003 DEBUG nova.network.neutron [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.642 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.643 187003 DEBUG oslo_concurrency.lockutils [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.643 187003 DEBUG nova.network.neutron [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing network info cache for port cb732a8a-275c-4d2f-8753-b13117c9e15b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.645 187003 DEBUG nova.virt.libvirt.vif [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.645 187003 DEBUG nova.network.os_vif_util [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.646 187003 DEBUG nova.network.os_vif_util [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.646 187003 DEBUG os_vif [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.647 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.647 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.647 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.651 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.651 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcb732a8a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.652 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcb732a8a-27, col_values=(('external_ids', {'iface-id': 'cb732a8a-275c-4d2f-8753-b13117c9e15b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:e2:66', 'vm-uuid': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.699 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.7006] manager: (tapcb732a8a-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.703 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.709 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.711 187003 INFO os_vif [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27')
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.712 187003 DEBUG nova.virt.libvirt.vif [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.712 187003 DEBUG nova.network.os_vif_util [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.713 187003 DEBUG nova.network.os_vif_util [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.715 187003 DEBUG nova.virt.libvirt.guest [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] attach device xml: <interface type="ethernet">
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:32:e2:66"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <target dev="tapcb732a8a-27"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]: </interface>
Nov 24 02:01:40 compute-0 nova_compute[186999]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 24 02:01:40 compute-0 kernel: tapcb732a8a-27: entered promiscuous mode
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.7303] manager: (tapcb732a8a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.730 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 ovn_controller[95380]: 2025-11-24T02:01:40Z|00087|binding|INFO|Claiming lport cb732a8a-275c-4d2f-8753-b13117c9e15b for this chassis.
Nov 24 02:01:40 compute-0 ovn_controller[95380]: 2025-11-24T02:01:40Z|00088|binding|INFO|cb732a8a-275c-4d2f-8753-b13117c9e15b: Claiming fa:16:3e:32:e2:66 10.100.0.28
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.739 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:e2:66 10.100.0.28'], port_security=['fa:16:3e:32:e2:66 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=193f1837-8554-4329-9156-c41225728b70, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=cb732a8a-275c-4d2f-8753-b13117c9e15b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.740 104238 INFO neutron.agent.ovn.metadata.agent [-] Port cb732a8a-275c-4d2f-8753-b13117c9e15b in datapath 22d3e7a3-70c8-4703-93b8-9dc2614f45c5 bound to our chassis
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.741 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22d3e7a3-70c8-4703-93b8-9dc2614f45c5
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.750 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[68143b3a-48f1-4348-b964-6285feeb5412]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.751 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap22d3e7a3-71 in ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.759 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap22d3e7a3-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.760 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2796d85b-5d4b-4ea8-a93a-a840b6918c3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.760 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7ffe27-5486-45d3-bbeb-7e7c6fb0a2db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.770 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[e03476ab-b5af-423c-92fe-1229ba8ef008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 systemd-udevd[216057]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.790 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 ovn_controller[95380]: 2025-11-24T02:01:40Z|00089|binding|INFO|Setting lport cb732a8a-275c-4d2f-8753-b13117c9e15b ovn-installed in OVS
Nov 24 02:01:40 compute-0 ovn_controller[95380]: 2025-11-24T02:01:40Z|00090|binding|INFO|Setting lport cb732a8a-275c-4d2f-8753-b13117c9e15b up in Southbound
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.793 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.7976] device (tapcb732a8a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.7988] device (tapcb732a8a-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.797 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0025d364-ce9d-4dc3-bf51-de8471e7652e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.837 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4671bc-ec2d-4e36-8423-a5b0cfc5ac4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.846 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ed997e0d-5e6e-4408-ac98-3f840beacd94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.8474] manager: (tap22d3e7a3-70): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Nov 24 02:01:40 compute-0 systemd-udevd[216060]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.849 187003 DEBUG nova.virt.libvirt.driver [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.850 187003 DEBUG nova.virt.libvirt.driver [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.850 187003 DEBUG nova.virt.libvirt.driver [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:95:60:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.851 187003 DEBUG nova.virt.libvirt.driver [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:32:e2:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.876 187003 DEBUG nova.virt.libvirt.guest [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:01:40</nova:creationTime>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:01:40 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     <nova:port uuid="cb732a8a-275c-4d2f-8753-b13117c9e15b">
Nov 24 02:01:40 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Nov 24 02:01:40 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:01:40 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:01:40 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:01:40 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.897 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dfa872-0a38-404e-bc09-3c80bcfc832f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.901 187003 DEBUG oslo_concurrency.lockutils [None req-0610c6fd-b56b-4297-9375-c1063de6ce04 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.901 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cca945-ef28-4b05-b1e6-842baa3f24c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 NetworkManager[55458]: <info>  [1763949700.9306] device (tap22d3e7a3-70): carrier: link connected
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.937 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[1347cce4-1e98-4a08-a270-df77b3db0174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.954 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b28f3885-b0bb-45da-b3f2-28c674b84b44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3e7a3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:0d:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 316759, 'reachable_time': 18790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216082, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.969 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1766855e-4258-4ff5-94b7-0f689e454858]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:da7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 316759, 'tstamp': 316759}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216083, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:40.983 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[62fc9476-3346-4b97-9c6f-c7466f3aea51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3e7a3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:0d:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 316759, 'reachable_time': 18790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216084, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.996 187003 DEBUG nova.compute.manager [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.996 187003 DEBUG oslo_concurrency.lockutils [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.997 187003 DEBUG oslo_concurrency.lockutils [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.997 187003 DEBUG oslo_concurrency.lockutils [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.997 187003 DEBUG nova.compute.manager [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:01:40 compute-0 nova_compute[186999]: 2025-11-24 02:01:40.997 187003 WARNING nova.compute.manager [req-5de9fb8a-ab9d-4d9f-a101-8ec55b158843 req-8dc3e5b4-d56b-4c30-8b9b-022404494120 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b for instance with vm_state active and task_state None.
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.015 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[11f28a9d-cc3d-4c0a-ba05-2c67e79d99bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.079 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a51b8afd-bd9e-48e7-b717-6337c4419e23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.081 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3e7a3-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.081 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.081 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22d3e7a3-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:41 compute-0 NetworkManager[55458]: <info>  [1763949701.0842] manager: (tap22d3e7a3-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.083 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:41 compute-0 kernel: tap22d3e7a3-70: entered promiscuous mode
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.086 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22d3e7a3-70, col_values=(('external_ids', {'iface-id': 'e89687d7-ffeb-45d3-8e59-7c1296f94457'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.087 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:41 compute-0 ovn_controller[95380]: 2025-11-24T02:01:41Z|00091|binding|INFO|Releasing lport e89687d7-ffeb-45d3-8e59-7c1296f94457 from this chassis (sb_readonly=0)
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.089 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.090 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22d3e7a3-70c8-4703-93b8-9dc2614f45c5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22d3e7a3-70c8-4703-93b8-9dc2614f45c5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.091 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[274d8fd5-1060-4861-89d8-1157ccf968f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.092 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-22d3e7a3-70c8-4703-93b8-9dc2614f45c5
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/22d3e7a3-70c8-4703-93b8-9dc2614f45c5.pid.haproxy
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 22d3e7a3-70c8-4703-93b8-9dc2614f45c5
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:01:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:41.093 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'env', 'PROCESS_TAG=haproxy-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/22d3e7a3-70c8-4703-93b8-9dc2614f45c5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.100 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:41 compute-0 podman[216116]: 2025-11-24 02:01:41.52334015 +0000 UTC m=+0.072009406 container create cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 02:01:41 compute-0 systemd[1]: Started libpod-conmon-cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0.scope.
Nov 24 02:01:41 compute-0 podman[216116]: 2025-11-24 02:01:41.487570129 +0000 UTC m=+0.036239465 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:01:41 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:01:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e351f63db727cb059edb5a0fe208f83513959e9855f5f73096bd27c22cceb62c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:01:41 compute-0 podman[216116]: 2025-11-24 02:01:41.632696181 +0000 UTC m=+0.181365477 container init cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 02:01:41 compute-0 podman[216116]: 2025-11-24 02:01:41.639406971 +0000 UTC m=+0.188076217 container start cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:01:41 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [NOTICE]   (216135) : New worker (216137) forked
Nov 24 02:01:41 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [NOTICE]   (216135) : Loading success.
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.703 187003 DEBUG nova.network.neutron [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated VIF entry in instance network info cache for port cb732a8a-275c-4d2f-8753-b13117c9e15b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.704 187003 DEBUG nova.network.neutron [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:41 compute-0 nova_compute[186999]: 2025-11-24 02:01:41.717 187003 DEBUG oslo_concurrency.lockutils [req-4a2a0952-d0da-4eb5-a428-31185d336eab req-e0e638d3-c232-41aa-822e-cb21f45bad49 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:42 compute-0 nova_compute[186999]: 2025-11-24 02:01:42.198 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:42 compute-0 podman[216146]: 2025-11-24 02:01:42.838474331 +0000 UTC m=+0.078191661 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.083 187003 DEBUG nova.compute.manager [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.083 187003 DEBUG oslo_concurrency.lockutils [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.083 187003 DEBUG oslo_concurrency.lockutils [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.083 187003 DEBUG oslo_concurrency.lockutils [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.083 187003 DEBUG nova.compute.manager [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:01:43 compute-0 nova_compute[186999]: 2025-11-24 02:01:43.084 187003 WARNING nova.compute.manager [req-30b71c55-07ca-4018-9666-e94aaf5884c7 req-6e35d50d-2ea8-4c30-81ff-57d7e5b407ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b for instance with vm_state active and task_state None.
Nov 24 02:01:43 compute-0 ovn_controller[95380]: 2025-11-24T02:01:43Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:e2:66 10.100.0.28
Nov 24 02:01:43 compute-0 ovn_controller[95380]: 2025-11-24T02:01:43Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:e2:66 10.100.0.28
Nov 24 02:01:44 compute-0 podman[216167]: 2025-11-24 02:01:44.817169587 +0000 UTC m=+0.062246371 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 02:01:44 compute-0 podman[216168]: 2025-11-24 02:01:44.867352345 +0000 UTC m=+0.112426119 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 24 02:01:45 compute-0 nova_compute[186999]: 2025-11-24 02:01:45.700 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:46 compute-0 sshd-session[216216]: Received disconnect from 46.188.119.26 port 37126:11: Bye Bye [preauth]
Nov 24 02:01:46 compute-0 sshd-session[216216]: Disconnected from authenticating user root 46.188.119.26 port 37126 [preauth]
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.201 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.900 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.900 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.913 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.977 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.977 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.986 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:01:47 compute-0 nova_compute[186999]: 2025-11-24 02:01:47.986 187003 INFO nova.compute.claims [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.116 187003 DEBUG nova.compute.provider_tree [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.128 187003 DEBUG nova.scheduler.client.report [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.144 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.145 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.182 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.183 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.195 187003 INFO nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.210 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.301 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.303 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.303 187003 INFO nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Creating image(s)
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.304 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.304 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.305 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.317 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.392 187003 DEBUG nova.policy [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.407 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.407 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.408 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.419 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:48.423 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:48.424 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:48.425 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.486 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.489 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.528 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.529 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.529 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.586 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.587 187003 DEBUG nova.virt.disk.api [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.588 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.641 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.642 187003 DEBUG nova.virt.disk.api [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.643 187003 DEBUG nova.objects.instance [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.660 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.661 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Ensure instance console log exists: /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.661 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.661 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:48 compute-0 nova_compute[186999]: 2025-11-24 02:01:48.662 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:49.728 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:01:49 compute-0 nova_compute[186999]: 2025-11-24 02:01:49.729 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:49 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:49.730 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:01:50 compute-0 nova_compute[186999]: 2025-11-24 02:01:50.629 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Successfully created port: badcc031-788b-4cda-90e3-b41f6fc93109 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:01:50 compute-0 nova_compute[186999]: 2025-11-24 02:01:50.702 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.238 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.851 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Successfully updated port: badcc031-788b-4cda-90e3-b41f6fc93109 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.867 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.868 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.868 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.955 187003 DEBUG nova.compute.manager [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-changed-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.956 187003 DEBUG nova.compute.manager [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Refreshing instance network info cache due to event network-changed-badcc031-788b-4cda-90e3-b41f6fc93109. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:01:52 compute-0 nova_compute[186999]: 2025-11-24 02:01:52.957 187003 DEBUG oslo_concurrency.lockutils [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:01:53 compute-0 nova_compute[186999]: 2025-11-24 02:01:53.630 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.449 187003 DEBUG nova.network.neutron [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Updating instance_info_cache with network_info: [{"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.468 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.469 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Instance network_info: |[{"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.470 187003 DEBUG oslo_concurrency.lockutils [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.470 187003 DEBUG nova.network.neutron [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Refreshing network info cache for port badcc031-788b-4cda-90e3-b41f6fc93109 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.475 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Start _get_guest_xml network_info=[{"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.482 187003 WARNING nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.493 187003 DEBUG nova.virt.libvirt.host [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.494 187003 DEBUG nova.virt.libvirt.host [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.499 187003 DEBUG nova.virt.libvirt.host [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.500 187003 DEBUG nova.virt.libvirt.host [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.501 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.501 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.502 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.503 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.503 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.504 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.504 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.505 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.506 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.506 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.507 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.507 187003 DEBUG nova.virt.hardware [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.513 187003 DEBUG nova.virt.libvirt.vif [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:01:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1783562238',display_name='tempest-TestNetworkBasicOps-server-1783562238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1783562238',id=7,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5fpzlAwDPoGt8v/XlZpi74p7VSUQOqLISmVIi5CCuLUxKBgUR3C5UHLH3sYsZ1vWJLerfFrN9ni2AuVuLHu3B3mW5eUnBvG7q1EGCdrSHiFPtSmh275YKCTKXKCnsYdA==',key_name='tempest-TestNetworkBasicOps-794759167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-acvs7pxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:01:48Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=55a0d2d6-cb27-4e1e-8f22-5542afa59b1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.514 187003 DEBUG nova.network.os_vif_util [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.515 187003 DEBUG nova.network.os_vif_util [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.517 187003 DEBUG nova.objects.instance [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.531 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <uuid>55a0d2d6-cb27-4e1e-8f22-5542afa59b1a</uuid>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <name>instance-00000007</name>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-1783562238</nova:name>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:01:54</nova:creationTime>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         <nova:port uuid="badcc031-788b-4cda-90e3-b41f6fc93109">
Nov 24 02:01:54 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <system>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="serial">55a0d2d6-cb27-4e1e-8f22-5542afa59b1a</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="uuid">55a0d2d6-cb27-4e1e-8f22-5542afa59b1a</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </system>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <os>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </os>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <features>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </features>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.config"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:ce:a1:d8"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <target dev="tapbadcc031-78"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/console.log" append="off"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <video>
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </video>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:01:54 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:01:54 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:01:54 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:01:54 compute-0 nova_compute[186999]: </domain>
Nov 24 02:01:54 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.533 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Preparing to wait for external event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.533 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.534 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.534 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.536 187003 DEBUG nova.virt.libvirt.vif [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:01:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1783562238',display_name='tempest-TestNetworkBasicOps-server-1783562238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1783562238',id=7,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5fpzlAwDPoGt8v/XlZpi74p7VSUQOqLISmVIi5CCuLUxKBgUR3C5UHLH3sYsZ1vWJLerfFrN9ni2AuVuLHu3B3mW5eUnBvG7q1EGCdrSHiFPtSmh275YKCTKXKCnsYdA==',key_name='tempest-TestNetworkBasicOps-794759167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-acvs7pxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:01:48Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=55a0d2d6-cb27-4e1e-8f22-5542afa59b1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.536 187003 DEBUG nova.network.os_vif_util [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.538 187003 DEBUG nova.network.os_vif_util [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.538 187003 DEBUG os_vif [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.539 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.540 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.541 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.546 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.547 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbadcc031-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.548 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbadcc031-78, col_values=(('external_ids', {'iface-id': 'badcc031-788b-4cda-90e3-b41f6fc93109', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:a1:d8', 'vm-uuid': '55a0d2d6-cb27-4e1e-8f22-5542afa59b1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.550 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:54 compute-0 NetworkManager[55458]: <info>  [1763949714.5526] manager: (tapbadcc031-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.554 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.557 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.559 187003 INFO os_vif [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78')
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.607 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.608 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.608 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:ce:a1:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.609 187003 INFO nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Using config drive
Nov 24 02:01:54 compute-0 podman[216237]: 2025-11-24 02:01:54.682772446 +0000 UTC m=+0.081706170 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.934 187003 INFO nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Creating config drive at /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.config
Nov 24 02:01:54 compute-0 nova_compute[186999]: 2025-11-24 02:01:54.943 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqs0aj97w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.082 187003 DEBUG oslo_concurrency.processutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqs0aj97w" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:01:55 compute-0 kernel: tapbadcc031-78: entered promiscuous mode
Nov 24 02:01:55 compute-0 NetworkManager[55458]: <info>  [1763949715.1359] manager: (tapbadcc031-78): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 24 02:01:55 compute-0 ovn_controller[95380]: 2025-11-24T02:01:55Z|00092|binding|INFO|Claiming lport badcc031-788b-4cda-90e3-b41f6fc93109 for this chassis.
Nov 24 02:01:55 compute-0 ovn_controller[95380]: 2025-11-24T02:01:55Z|00093|binding|INFO|badcc031-788b-4cda-90e3-b41f6fc93109: Claiming fa:16:3e:ce:a1:d8 10.100.0.30
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.136 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.143 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a1:d8 10.100.0.30'], port_security=['fa:16:3e:ce:a1:d8 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '55a0d2d6-cb27-4e1e-8f22-5542afa59b1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07f296e8-3eb1-4d75-aee1-938f7135e892', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=193f1837-8554-4329-9156-c41225728b70, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=badcc031-788b-4cda-90e3-b41f6fc93109) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.145 104238 INFO neutron.agent.ovn.metadata.agent [-] Port badcc031-788b-4cda-90e3-b41f6fc93109 in datapath 22d3e7a3-70c8-4703-93b8-9dc2614f45c5 bound to our chassis
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.146 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22d3e7a3-70c8-4703-93b8-9dc2614f45c5
Nov 24 02:01:55 compute-0 ovn_controller[95380]: 2025-11-24T02:01:55Z|00094|binding|INFO|Setting lport badcc031-788b-4cda-90e3-b41f6fc93109 ovn-installed in OVS
Nov 24 02:01:55 compute-0 ovn_controller[95380]: 2025-11-24T02:01:55Z|00095|binding|INFO|Setting lport badcc031-788b-4cda-90e3-b41f6fc93109 up in Southbound
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.156 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.163 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.164 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a48799f5-7661-4b1c-ade3-a970bdfa805f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 systemd-udevd[216273]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:01:55 compute-0 NetworkManager[55458]: <info>  [1763949715.1799] device (tapbadcc031-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:01:55 compute-0 NetworkManager[55458]: <info>  [1763949715.1810] device (tapbadcc031-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:01:55 compute-0 systemd-machined[153319]: New machine qemu-7-instance-00000007.
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.200 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[652f99ef-f8ba-4394-ae9d-a1e9ddc9812d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.204 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcf3e73-1e63-4e20-815b-49c192272a6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.241 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9f7157-c666-4863-8f5e-e7d0b5b2aa80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.269 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3523b242-2303-4498-aa4b-fbd40ca8b486]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3e7a3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:0d:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 316759, 'reachable_time': 18790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216283, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.291 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[8a715aad-c46c-4ad8-9a74-56183716761f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22d3e7a3-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 316770, 'tstamp': 316770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216288, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap22d3e7a3-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 316773, 'tstamp': 316773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216288, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.293 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3e7a3-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.296 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.297 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.298 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22d3e7a3-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.298 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.299 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22d3e7a3-70, col_values=(('external_ids', {'iface-id': 'e89687d7-ffeb-45d3-8e59-7c1296f94457'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:55 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:55.299 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.343 187003 DEBUG nova.compute.manager [req-e67ed601-3d17-4ff4-8f9d-e7896ece2768 req-00611894-fcc4-4a53-95e5-0b3ae58e7657 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.343 187003 DEBUG oslo_concurrency.lockutils [req-e67ed601-3d17-4ff4-8f9d-e7896ece2768 req-00611894-fcc4-4a53-95e5-0b3ae58e7657 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.344 187003 DEBUG oslo_concurrency.lockutils [req-e67ed601-3d17-4ff4-8f9d-e7896ece2768 req-00611894-fcc4-4a53-95e5-0b3ae58e7657 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.344 187003 DEBUG oslo_concurrency.lockutils [req-e67ed601-3d17-4ff4-8f9d-e7896ece2768 req-00611894-fcc4-4a53-95e5-0b3ae58e7657 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.344 187003 DEBUG nova.compute.manager [req-e67ed601-3d17-4ff4-8f9d-e7896ece2768 req-00611894-fcc4-4a53-95e5-0b3ae58e7657 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Processing event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.633 187003 DEBUG nova.network.neutron [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Updated VIF entry in instance network info cache for port badcc031-788b-4cda-90e3-b41f6fc93109. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.634 187003 DEBUG nova.network.neutron [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Updating instance_info_cache with network_info: [{"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.646 187003 DEBUG oslo_concurrency.lockutils [req-3651b098-9389-4bd6-b2f2-094a24baa59e req-9885b7bf-dba3-421c-80b3-7a577216fc7a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.780 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.781 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949715.7801802, 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.782 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] VM Started (Lifecycle Event)
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.786 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.790 187003 INFO nova.virt.libvirt.driver [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Instance spawned successfully.
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.792 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.797 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.799 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.810 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.811 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.811 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.812 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.812 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.813 187003 DEBUG nova.virt.libvirt.driver [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.820 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.820 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949715.780953, 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.820 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] VM Paused (Lifecycle Event)
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.842 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.849 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949715.7859297, 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.849 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] VM Resumed (Lifecycle Event)
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.870 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.873 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.878 187003 INFO nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Took 7.58 seconds to spawn the instance on the hypervisor.
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.878 187003 DEBUG nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.901 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.933 187003 INFO nova.compute.manager [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Took 7.99 seconds to build instance.
Nov 24 02:01:55 compute-0 nova_compute[186999]: 2025-11-24 02:01:55.945 187003 DEBUG oslo_concurrency.lockutils [None req-5562951b-a24b-424c-9d28-93e49ec16857 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:01:56.732 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.240 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.416 187003 DEBUG nova.compute.manager [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.417 187003 DEBUG oslo_concurrency.lockutils [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.417 187003 DEBUG oslo_concurrency.lockutils [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.417 187003 DEBUG oslo_concurrency.lockutils [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.417 187003 DEBUG nova.compute.manager [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] No waiting events found dispatching network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:01:57 compute-0 nova_compute[186999]: 2025-11-24 02:01:57.418 187003 WARNING nova.compute.manager [req-20bd741f-2600-4107-84fd-902031cd2f48 req-43016345-43e3-42c8-86b2-e8c38303ea7d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received unexpected event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 for instance with vm_state active and task_state None.
Nov 24 02:01:58 compute-0 podman[216297]: 2025-11-24 02:01:58.8599427 +0000 UTC m=+0.095068189 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 02:01:59 compute-0 nova_compute[186999]: 2025-11-24 02:01:59.553 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:02 compute-0 nova_compute[186999]: 2025-11-24 02:02:02.274 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:04 compute-0 nova_compute[186999]: 2025-11-24 02:02:04.600 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:04 compute-0 nova_compute[186999]: 2025-11-24 02:02:04.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:04 compute-0 nova_compute[186999]: 2025-11-24 02:02:04.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 02:02:04 compute-0 podman[216318]: 2025-11-24 02:02:04.808871629 +0000 UTC m=+0.057894507 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.312 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:07 compute-0 ovn_controller[95380]: 2025-11-24T02:02:07Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:a1:d8 10.100.0.30
Nov 24 02:02:07 compute-0 ovn_controller[95380]: 2025-11-24T02:02:07Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:a1:d8 10.100.0.30
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.710 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.733 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Triggering sync for uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.734 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Triggering sync for uuid 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.734 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.735 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.736 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.736 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.781 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.782 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:07 compute-0 nova_compute[186999]: 2025-11-24 02:02:07.796 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:09 compute-0 nova_compute[186999]: 2025-11-24 02:02:09.602 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:09 compute-0 nova_compute[186999]: 2025-11-24 02:02:09.768 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:09 compute-0 podman[216360]: 2025-11-24 02:02:09.79617078 +0000 UTC m=+0.051324112 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 24 02:02:11 compute-0 nova_compute[186999]: 2025-11-24 02:02:11.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:11 compute-0 nova_compute[186999]: 2025-11-24 02:02:11.770 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:02:11 compute-0 nova_compute[186999]: 2025-11-24 02:02:11.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:02:12 compute-0 nova_compute[186999]: 2025-11-24 02:02:12.369 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:12 compute-0 nova_compute[186999]: 2025-11-24 02:02:12.618 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:02:12 compute-0 nova_compute[186999]: 2025-11-24 02:02:12.618 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:02:12 compute-0 nova_compute[186999]: 2025-11-24 02:02:12.618 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 02:02:12 compute-0 nova_compute[186999]: 2025-11-24 02:02:12.618 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:13 compute-0 podman[216379]: 2025-11-24 02:02:13.829120427 +0000 UTC m=+0.069391292 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:02:14 compute-0 nova_compute[186999]: 2025-11-24 02:02:14.642 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.037 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.057 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.058 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.059 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.059 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.059 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.060 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.060 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.083 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.083 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.084 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.084 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.169 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:02:15 compute-0 podman[216401]: 2025-11-24 02:02:15.224071155 +0000 UTC m=+0.089859601 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:02:15 compute-0 podman[216402]: 2025-11-24 02:02:15.251253463 +0000 UTC m=+0.111607975 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.258 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.259 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.329 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.338 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.397 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.398 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.447 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.595 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.596 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5436MB free_disk=73.40185546875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.596 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.597 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.782 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.782 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.782 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.782 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.895 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing inventories for resource provider f28f14d1-2972-450a-b67e-0899e7918234 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.977 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating ProviderTree inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.978 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 02:02:15 compute-0 nova_compute[186999]: 2025-11-24 02:02:15.991 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing aggregate associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.014 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing trait associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, traits: COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AESNI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,HW_CPU_X86_ABM,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.072 187003 DEBUG nova.compute.manager [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-changed-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.073 187003 DEBUG nova.compute.manager [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing instance network info cache due to event network-changed-cb732a8a-275c-4d2f-8753-b13117c9e15b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.073 187003 DEBUG oslo_concurrency.lockutils [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.073 187003 DEBUG oslo_concurrency.lockutils [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.074 187003 DEBUG nova.network.neutron [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing network info cache for port cb732a8a-275c-4d2f-8753-b13117c9e15b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.080 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.093 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.115 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.115 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.518s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.116 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.781 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.781 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.782 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 02:02:16 compute-0 nova_compute[186999]: 2025-11-24 02:02:16.795 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.053 187003 DEBUG nova.network.neutron [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated VIF entry in instance network info cache for port cb732a8a-275c-4d2f-8753-b13117c9e15b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.053 187003 DEBUG nova.network.neutron [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.069 187003 DEBUG oslo_concurrency.lockutils [req-2f85b80b-839b-41eb-bea2-d1b67e62dbd5 req-14dbf3d2-15df-4035-9ee2-af12942c6cb6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.371 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.784 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:02:17 compute-0 nova_compute[186999]: 2025-11-24 02:02:17.785 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:02:19 compute-0 nova_compute[186999]: 2025-11-24 02:02:19.646 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:22 compute-0 nova_compute[186999]: 2025-11-24 02:02:22.374 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:24 compute-0 nova_compute[186999]: 2025-11-24 02:02:24.648 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:24 compute-0 podman[216461]: 2025-11-24 02:02:24.854503118 +0000 UTC m=+0.097273200 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 02:02:26 compute-0 sshd-session[216483]: Received disconnect from 154.90.59.75 port 42418:11: Bye Bye [preauth]
Nov 24 02:02:26 compute-0 sshd-session[216483]: Disconnected from authenticating user root 154.90.59.75 port 42418 [preauth]
Nov 24 02:02:27 compute-0 nova_compute[186999]: 2025-11-24 02:02:27.378 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:29 compute-0 nova_compute[186999]: 2025-11-24 02:02:29.692 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:29 compute-0 podman[216485]: 2025-11-24 02:02:29.853929272 +0000 UTC m=+0.089018647 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 02:02:32 compute-0 nova_compute[186999]: 2025-11-24 02:02:32.424 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:34 compute-0 nova_compute[186999]: 2025-11-24 02:02:34.696 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:35 compute-0 podman[216507]: 2025-11-24 02:02:35.814610955 +0000 UTC m=+0.059397539 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 02:02:37 compute-0 nova_compute[186999]: 2025-11-24 02:02:37.427 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:39 compute-0 nova_compute[186999]: 2025-11-24 02:02:39.700 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:40 compute-0 podman[216533]: 2025-11-24 02:02:40.819282566 +0000 UTC m=+0.068080845 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Nov 24 02:02:42 compute-0 nova_compute[186999]: 2025-11-24 02:02:42.430 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.862 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.863 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.863 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.864 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.865 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.868 187003 INFO nova.compute.manager [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Terminating instance
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.870 187003 DEBUG nova.compute.manager [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:02:43 compute-0 kernel: tapbadcc031-78 (unregistering): left promiscuous mode
Nov 24 02:02:43 compute-0 NetworkManager[55458]: <info>  [1763949763.9044] device (tapbadcc031-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:02:43 compute-0 ovn_controller[95380]: 2025-11-24T02:02:43Z|00096|binding|INFO|Releasing lport badcc031-788b-4cda-90e3-b41f6fc93109 from this chassis (sb_readonly=0)
Nov 24 02:02:43 compute-0 ovn_controller[95380]: 2025-11-24T02:02:43Z|00097|binding|INFO|Setting lport badcc031-788b-4cda-90e3-b41f6fc93109 down in Southbound
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.931 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:43 compute-0 ovn_controller[95380]: 2025-11-24T02:02:43Z|00098|binding|INFO|Removing iface tapbadcc031-78 ovn-installed in OVS
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.936 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:43.942 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a1:d8 10.100.0.30'], port_security=['fa:16:3e:ce:a1:d8 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '55a0d2d6-cb27-4e1e-8f22-5542afa59b1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07f296e8-3eb1-4d75-aee1-938f7135e892', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=193f1837-8554-4329-9156-c41225728b70, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=badcc031-788b-4cda-90e3-b41f6fc93109) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:02:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:43.946 104238 INFO neutron.agent.ovn.metadata.agent [-] Port badcc031-788b-4cda-90e3-b41f6fc93109 in datapath 22d3e7a3-70c8-4703-93b8-9dc2614f45c5 unbound from our chassis
Nov 24 02:02:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:43.947 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22d3e7a3-70c8-4703-93b8-9dc2614f45c5
Nov 24 02:02:43 compute-0 nova_compute[186999]: 2025-11-24 02:02:43.961 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:43 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:43.974 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0b5f69-47bb-45c9-8d35-7c5f7f85824b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:43 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 24 02:02:43 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.007s CPU time.
Nov 24 02:02:44 compute-0 systemd-machined[153319]: Machine qemu-7-instance-00000007 terminated.
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.008 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb7713c-ed46-4b6a-8932-cffe1a4c4542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.012 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[898e929f-18cd-45b2-8ba9-490c63acf7d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:44 compute-0 podman[216552]: 2025-11-24 02:02:44.021194475 +0000 UTC m=+0.083025877 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.039 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdc940b-8bda-4ab7-b1dd-685062b6581c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.059 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4294a2ca-90b3-47fc-8038-d29b36a900a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3e7a3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:0d:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 316759, 'reachable_time': 35694, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216584, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.081 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[db24e22d-549f-43dc-a4d0-8e3de76bc982]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22d3e7a3-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 316770, 'tstamp': 316770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216585, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap22d3e7a3-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 316773, 'tstamp': 316773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216585, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.084 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3e7a3-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.086 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.091 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.091 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22d3e7a3-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.092 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.092 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22d3e7a3-70, col_values=(('external_ids', {'iface-id': 'e89687d7-ffeb-45d3-8e59-7c1296f94457'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.092 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.096 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.101 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.134 187003 INFO nova.virt.libvirt.driver [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Instance destroyed successfully.
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.135 187003 DEBUG nova.objects.instance [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.146 187003 DEBUG nova.virt.libvirt.vif [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1783562238',display_name='tempest-TestNetworkBasicOps-server-1783562238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1783562238',id=7,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5fpzlAwDPoGt8v/XlZpi74p7VSUQOqLISmVIi5CCuLUxKBgUR3C5UHLH3sYsZ1vWJLerfFrN9ni2AuVuLHu3B3mW5eUnBvG7q1EGCdrSHiFPtSmh275YKCTKXKCnsYdA==',key_name='tempest-TestNetworkBasicOps-794759167',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-acvs7pxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:55Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=55a0d2d6-cb27-4e1e-8f22-5542afa59b1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.147 187003 DEBUG nova.network.os_vif_util [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "badcc031-788b-4cda-90e3-b41f6fc93109", "address": "fa:16:3e:ce:a1:d8", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbadcc031-78", "ovs_interfaceid": "badcc031-788b-4cda-90e3-b41f6fc93109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.148 187003 DEBUG nova.network.os_vif_util [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.148 187003 DEBUG os_vif [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.150 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.150 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbadcc031-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.152 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.154 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.154 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.158 187003 INFO os_vif [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a1:d8,bridge_name='br-int',has_traffic_filtering=True,id=badcc031-788b-4cda-90e3-b41f6fc93109,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbadcc031-78')
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.159 187003 INFO nova.virt.libvirt.driver [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Deleting instance files /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a_del
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.159 187003 INFO nova.virt.libvirt.driver [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Deletion of /var/lib/nova/instances/55a0d2d6-cb27-4e1e-8f22-5542afa59b1a_del complete
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.211 187003 INFO nova.compute.manager [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.212 187003 DEBUG oslo.service.loopingcall [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.212 187003 DEBUG nova.compute.manager [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.212 187003 DEBUG nova.network.neutron [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.285 187003 DEBUG nova.compute.manager [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-unplugged-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.286 187003 DEBUG oslo_concurrency.lockutils [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.286 187003 DEBUG oslo_concurrency.lockutils [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.286 187003 DEBUG oslo_concurrency.lockutils [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.287 187003 DEBUG nova.compute.manager [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] No waiting events found dispatching network-vif-unplugged-badcc031-788b-4cda-90e3-b41f6fc93109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.287 187003 DEBUG nova.compute.manager [req-63642e14-8fe3-406f-80ec-192cfc6712ef req-dfe6d5ea-ef0e-4039-9c30-910912b687e8 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-unplugged-badcc031-788b-4cda-90e3-b41f6fc93109 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 02:02:44 compute-0 sshd-session[216506]: Connection closed by authenticating user root 68.210.96.117 port 38176 [preauth]
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.900 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:02:44 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:44.901 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.914 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.925 187003 DEBUG nova.network.neutron [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.941 187003 INFO nova.compute.manager [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Took 0.73 seconds to deallocate network for instance.
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.994 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:44 compute-0 nova_compute[186999]: 2025-11-24 02:02:44.994 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.004 187003 DEBUG nova.compute.manager [req-0cb6aa71-6687-45f5-9f2c-35de6ee2fe7b req-71bee5bf-6f8e-45b0-8b6e-be0f6663cc34 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-deleted-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.069 187003 DEBUG nova.compute.provider_tree [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.080 187003 DEBUG nova.scheduler.client.report [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.097 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.121 187003 INFO nova.scheduler.client.report [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.170 187003 DEBUG oslo_concurrency.lockutils [None req-394c326c-34e5-4905-8c9f-4a8d2dad73ff e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:45 compute-0 podman[216603]: 2025-11-24 02:02:45.840508566 +0000 UTC m=+0.081243487 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:02:45 compute-0 podman[216604]: 2025-11-24 02:02:45.888719658 +0000 UTC m=+0.126658970 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.926 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-cb732a8a-275c-4d2f-8753-b13117c9e15b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.927 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-cb732a8a-275c-4d2f-8753-b13117c9e15b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.941 187003 DEBUG nova.objects.instance [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'flavor' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.961 187003 DEBUG nova.virt.libvirt.vif [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.961 187003 DEBUG nova.network.os_vif_util [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.962 187003 DEBUG nova.network.os_vif_util [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.966 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.968 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.971 187003 DEBUG nova.virt.libvirt.driver [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Attempting to detach device tapcb732a8a-27 from instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.972 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] detach device xml: <interface type="ethernet">
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:32:e2:66"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <target dev="tapcb732a8a-27"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]: </interface>
Nov 24 02:02:45 compute-0 nova_compute[186999]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.980 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.983 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <name>instance-00000006</name>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <uuid>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</uuid>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:01:40</nova:creationTime>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <nova:port uuid="cb732a8a-275c-4d2f-8753-b13117c9e15b">
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:45 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <resource>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </resource>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <system>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='serial'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='uuid'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </system>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <os>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </os>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <features>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </features>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk' index='2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config' index='1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:95:60:76'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target dev='tap175bb896-4c'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:32:e2:66'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target dev='tapcb732a8a-27'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='net1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       </target>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </console>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </graphics>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <video>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </video>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c297,c621</label>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c297,c621</imagelabel>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 02:02:45 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:45 compute-0 nova_compute[186999]: </domain>
Nov 24 02:02:45 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.984 187003 INFO nova.virt.libvirt.driver [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully detached device tapcb732a8a-27 from instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 from the persistent domain config.
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.985 187003 DEBUG nova.virt.libvirt.driver [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] (1/8): Attempting to detach device tapcb732a8a-27 with device alias net1 from instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 24 02:02:45 compute-0 nova_compute[186999]: 2025-11-24 02:02:45.985 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] detach device xml: <interface type="ethernet">
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <mac address="fa:16:3e:32:e2:66"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <model type="virtio"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <mtu size="1442"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]:   <target dev="tapcb732a8a-27"/>
Nov 24 02:02:45 compute-0 nova_compute[186999]: </interface>
Nov 24 02:02:45 compute-0 nova_compute[186999]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 02:02:46 compute-0 kernel: tapcb732a8a-27 (unregistering): left promiscuous mode
Nov 24 02:02:46 compute-0 NetworkManager[55458]: <info>  [1763949766.0921] device (tapcb732a8a-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:02:46 compute-0 ovn_controller[95380]: 2025-11-24T02:02:46Z|00099|binding|INFO|Releasing lport cb732a8a-275c-4d2f-8753-b13117c9e15b from this chassis (sb_readonly=0)
Nov 24 02:02:46 compute-0 ovn_controller[95380]: 2025-11-24T02:02:46Z|00100|binding|INFO|Setting lport cb732a8a-275c-4d2f-8753-b13117c9e15b down in Southbound
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.133 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 ovn_controller[95380]: 2025-11-24T02:02:46Z|00101|binding|INFO|Removing iface tapcb732a8a-27 ovn-installed in OVS
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.135 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.141 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:e2:66 10.100.0.28', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=193f1837-8554-4329-9156-c41225728b70, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=cb732a8a-275c-4d2f-8753-b13117c9e15b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.142 104238 INFO neutron.agent.ovn.metadata.agent [-] Port cb732a8a-275c-4d2f-8753-b13117c9e15b in datapath 22d3e7a3-70c8-4703-93b8-9dc2614f45c5 unbound from our chassis
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.143 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22d3e7a3-70c8-4703-93b8-9dc2614f45c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.145 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.146 187003 DEBUG nova.virt.libvirt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Received event <DeviceRemovedEvent: 1763949766.1465683, a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.145 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[732fce0a-5f89-4242-b5e0-fa329a9f885b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.147 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5 namespace which is not needed anymore
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.148 187003 DEBUG nova.virt.libvirt.driver [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Start waiting for the detach event from libvirt for device tapcb732a8a-27 with device alias net1 for instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.148 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.151 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <name>instance-00000006</name>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <uuid>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</uuid>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:01:40</nova:creationTime>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:port uuid="cb732a8a-275c-4d2f-8753-b13117c9e15b">
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:46 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <resource>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </resource>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <system>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='serial'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='uuid'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </system>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <os>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </os>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <features>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </features>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk' index='2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config' index='1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:95:60:76'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target dev='tap175bb896-4c'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       </target>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </console>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </graphics>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <video>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </video>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c297,c621</label>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c297,c621</imagelabel>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:46 compute-0 nova_compute[186999]: </domain>
Nov 24 02:02:46 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.151 187003 INFO nova.virt.libvirt.driver [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully detached device tapcb732a8a-27 from instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 from the live domain config.
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.152 187003 DEBUG nova.virt.libvirt.vif [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.152 187003 DEBUG nova.network.os_vif_util [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.153 187003 DEBUG nova.network.os_vif_util [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.153 187003 DEBUG os_vif [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.154 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.154 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcb732a8a-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.155 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.157 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.159 187003 INFO os_vif [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27')
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.159 187003 DEBUG nova.virt.libvirt.guest [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:02:46</nova:creationTime>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:46 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:46 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:46 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:46 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:46 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [NOTICE]   (216135) : haproxy version is 2.8.14-c23fe91
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [NOTICE]   (216135) : path to executable is /usr/sbin/haproxy
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [WARNING]  (216135) : Exiting Master process...
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [WARNING]  (216135) : Exiting Master process...
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [ALERT]    (216135) : Current worker (216137) exited with code 143 (Terminated)
Nov 24 02:02:46 compute-0 neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5[216131]: [WARNING]  (216135) : All workers exited. Exiting... (0)
Nov 24 02:02:46 compute-0 systemd[1]: libpod-cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0.scope: Deactivated successfully.
Nov 24 02:02:46 compute-0 podman[216678]: 2025-11-24 02:02:46.275442249 +0000 UTC m=+0.041491714 container died cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 02:02:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0-userdata-shm.mount: Deactivated successfully.
Nov 24 02:02:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-e351f63db727cb059edb5a0fe208f83513959e9855f5f73096bd27c22cceb62c-merged.mount: Deactivated successfully.
Nov 24 02:02:46 compute-0 podman[216678]: 2025-11-24 02:02:46.310653834 +0000 UTC m=+0.076703299 container cleanup cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:02:46 compute-0 systemd[1]: libpod-conmon-cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0.scope: Deactivated successfully.
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 DEBUG nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "55a0d2d6-cb27-4e1e-8f22-5542afa59b1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 DEBUG nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] No waiting events found dispatching network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.351 187003 WARNING nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Received unexpected event network-vif-plugged-badcc031-788b-4cda-90e3-b41f6fc93109 for instance with vm_state deleted and task_state None.
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 DEBUG nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-unplugged-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 DEBUG oslo_concurrency.lockutils [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 DEBUG nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-unplugged-cb732a8a-275c-4d2f-8753-b13117c9e15b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.352 187003 WARNING nova.compute.manager [req-80c23634-9ad4-405d-bba6-145a0dfc12d9 req-ca72063e-cb23-4ae4-b3e0-aa2c0ab45ee1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-unplugged-cb732a8a-275c-4d2f-8753-b13117c9e15b for instance with vm_state active and task_state None.
Nov 24 02:02:46 compute-0 podman[216710]: 2025-11-24 02:02:46.380467827 +0000 UTC m=+0.044990072 container remove cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.385 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ec45359a-e8cf-4a2e-bfda-0965ed8714af]: (4, ('Mon Nov 24 02:02:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5 (cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0)\ncf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0\nMon Nov 24 02:02:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5 (cf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0)\ncf179fad2c7d4d6cc74bb4de62710d53fe828dabaf07dac283c6a4fd2f56c6a0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.386 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d21cc856-8739-43bf-bc78-392c26ecc7a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.387 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3e7a3-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:46 compute-0 kernel: tap22d3e7a3-70: left promiscuous mode
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.389 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.401 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.404 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[55709347-ebaf-47f7-829f-9d830e76b2ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.424 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f1156b00-61c2-40f7-a665-172c6ba546b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.425 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9b489fcb-9331-41b2-ad7b-4bd3164756ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.442 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3d561b-eb48-43b3-9caa-75ec5065705e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 316749, 'reachable_time': 22617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216725, 'error': None, 'target': 'ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d22d3e7a3\x2d70c8\x2d4703\x2d93b8\x2d9dc2614f45c5.mount: Deactivated successfully.
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.446 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-22d3e7a3-70c8-4703-93b8-9dc2614f45c5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:02:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:46.447 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a9e1e4-0d2d-4092-92d8-f1324cb82d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.661 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.661 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:02:46 compute-0 nova_compute[186999]: 2025-11-24 02:02:46.662 187003 DEBUG nova.network.neutron [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.095 187003 DEBUG nova.compute.manager [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-deleted-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.096 187003 INFO nova.compute.manager [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Neutron deleted interface cb732a8a-275c-4d2f-8753-b13117c9e15b; detaching it from the instance and deleting it from the info cache
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.097 187003 DEBUG nova.network.neutron [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.118 187003 DEBUG nova.objects.instance [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lazy-loading 'system_metadata' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.140 187003 DEBUG nova.objects.instance [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lazy-loading 'flavor' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.154 187003 DEBUG nova.virt.libvirt.vif [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.155 187003 DEBUG nova.network.os_vif_util [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.156 187003 DEBUG nova.network.os_vif_util [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.159 187003 DEBUG nova.virt.libvirt.guest [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.163 187003 DEBUG nova.virt.libvirt.guest [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <name>instance-00000006</name>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <uuid>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</uuid>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:02:46</nova:creationTime>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <resource>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </resource>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <system>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='serial'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='uuid'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </system>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <os>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </os>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <features>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </features>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk' index='2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config' index='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:95:60:76'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='tap175bb896-4c'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       </target>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </console>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </graphics>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <video>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </video>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c297,c621</label>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c297,c621</imagelabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]: </domain>
Nov 24 02:02:47 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.163 187003 DEBUG nova.virt.libvirt.guest [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.167 187003 DEBUG nova.virt.libvirt.guest [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:32:e2:66"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcb732a8a-27"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <name>instance-00000006</name>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <uuid>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</uuid>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:02:46</nova:creationTime>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <memory unit='KiB'>131072</memory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <vcpu placement='static'>1</vcpu>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <resource>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <partition>/machine</partition>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </resource>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <sysinfo type='smbios'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <system>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='manufacturer'>RDO</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='serial'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='uuid'>a2e5c148-2c17-4f54-a3d6-b5655b0e87f1</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <entry name='family'>Virtual Machine</entry>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </system>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <os>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <boot dev='hd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <smbios mode='sysinfo'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </os>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <features>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <vmcoreinfo state='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </features>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <vendor>AMD</vendor>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='x2apic'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='hypervisor'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='stibp'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='overflow-recov'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='succor'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='ibrs'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='lbrv'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='pause-filter'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='xsaves'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='svm'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='require' name='topoext'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='npt'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <feature policy='disable' name='nrip-save'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <clock offset='utc'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <timer name='hpet' present='no'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_poweroff>destroy</on_poweroff>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_reboot>restart</on_reboot>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <on_crash>destroy</on_crash>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <disk type='file' device='disk'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk' index='2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backingStore type='file' index='3'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <format type='raw'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <source file='/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <backingStore/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       </backingStore>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='vda' bus='virtio'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='virtio-disk0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <disk type='file' device='cdrom'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/disk.config' index='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backingStore/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='sda' bus='sata'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <readonly/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='sata0-0-0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pcie.0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='1' port='0x10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='2' port='0x11'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='3' port='0x12'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='4' port='0x13'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='5' port='0x14'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='6' port='0x15'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='7' port='0x16'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='8' port='0x17'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.8'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='9' port='0x18'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.9'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='10' port='0x19'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='11' port='0x1a'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.11'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='12' port='0x1b'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.12'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='13' port='0x1c'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.13'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='14' port='0x1d'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.14'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='15' port='0x1e'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.15'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='16' port='0x1f'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.16'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='17' port='0x20'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.17'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='18' port='0x21'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.18'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='19' port='0x22'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.19'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='20' port='0x23'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.20'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='21' port='0x24'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.21'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='22' port='0x25'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.22'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='23' port='0x26'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.23'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='24' port='0x27'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.24'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-root-port'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target chassis='25' port='0x28'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.25'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model name='pcie-pci-bridge'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='pci.26'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='usb'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <controller type='sata' index='0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='ide'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </controller>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <interface type='ethernet'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <mac address='fa:16:3e:95:60:76'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target dev='tap175bb896-4c'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model type='virtio'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <mtu size='1442'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='net0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <serial type='pty'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target type='isa-serial' port='0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:         <model name='isa-serial'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       </target>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <source path='/dev/pts/0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <log file='/var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1/console.log' append='off'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <target type='serial' port='0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='serial0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </console>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='tablet' bus='usb'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='usb' bus='0' port='1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='mouse' bus='ps2'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input1'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <input type='keyboard' bus='ps2'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='input2'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </input>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <listen type='address' address='::0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </graphics>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <audio id='1' type='none'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <video>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='video0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </video>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <watchdog model='itco' action='reset'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='watchdog0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </watchdog>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <memballoon model='virtio'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <stats period='10'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='balloon0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <rng model='virtio'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <backend model='random'>/dev/urandom</backend>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <alias name='rng0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <label>system_u:system_r:svirt_t:s0:c297,c621</label>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c297,c621</imagelabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <label>+107:+107</label>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <imagelabel>+107:+107</imagelabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </seclabel>
Nov 24 02:02:47 compute-0 nova_compute[186999]: </domain>
Nov 24 02:02:47 compute-0 nova_compute[186999]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.168 187003 WARNING nova.virt.libvirt.driver [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Detaching interface fa:16:3e:32:e2:66 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapcb732a8a-27' not found.
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.169 187003 DEBUG nova.virt.libvirt.vif [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.169 187003 DEBUG nova.network.os_vif_util [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converting VIF {"id": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "address": "fa:16:3e:32:e2:66", "network": {"id": "22d3e7a3-70c8-4703-93b8-9dc2614f45c5", "bridge": "br-int", "label": "tempest-network-smoke--1948937097", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb732a8a-27", "ovs_interfaceid": "cb732a8a-275c-4d2f-8753-b13117c9e15b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.170 187003 DEBUG nova.network.os_vif_util [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.170 187003 DEBUG os_vif [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.172 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.173 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcb732a8a-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.173 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.175 187003 INFO os_vif [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:e2:66,bridge_name='br-int',has_traffic_filtering=True,id=cb732a8a-275c-4d2f-8753-b13117c9e15b,network=Network(22d3e7a3-70c8-4703-93b8-9dc2614f45c5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb732a8a-27')
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.176 187003 DEBUG nova.virt.libvirt.guest [req-4553c661-1f50-40a8-afcc-79b366127533 req-bcd30ddb-b8a6-4954-92a9-a810c2bc4149 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:name>tempest-TestNetworkBasicOps-server-451741380</nova:name>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:creationTime>2025-11-24 02:02:47</nova:creationTime>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:flavor name="m1.nano">
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:memory>128</nova:memory>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:disk>1</nova:disk>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:swap>0</nova:swap>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:vcpus>1</nova:vcpus>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:flavor>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:owner>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   <nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     <nova:port uuid="175bb896-4ccd-40b1-8746-160b190ce3fc">
Nov 24 02:02:47 compute-0 nova_compute[186999]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:02:47 compute-0 nova_compute[186999]:     </nova:port>
Nov 24 02:02:47 compute-0 nova_compute[186999]:   </nova:ports>
Nov 24 02:02:47 compute-0 nova_compute[186999]: </nova:instance>
Nov 24 02:02:47 compute-0 nova_compute[186999]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.432 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.566 187003 INFO nova.network.neutron [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Port cb732a8a-275c-4d2f-8753-b13117c9e15b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.567 187003 DEBUG nova.network.neutron [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.587 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:02:47 compute-0 ovn_controller[95380]: 2025-11-24T02:02:47Z|00102|binding|INFO|Releasing lport 3fd96984-65bc-4f5c-892b-b6485ade3b7a from this chassis (sb_readonly=0)
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.618 187003 DEBUG oslo_concurrency.lockutils [None req-2dc417d1-d421-4a52-b44a-4e0a29a4b80b e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "interface-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-cb732a8a-275c-4d2f-8753-b13117c9e15b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 1.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:47 compute-0 nova_compute[186999]: 2025-11-24 02:02:47.635 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:48.424 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:48.424 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:48.425 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.432 187003 DEBUG nova.compute.manager [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.433 187003 DEBUG oslo_concurrency.lockutils [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.434 187003 DEBUG oslo_concurrency.lockutils [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.434 187003 DEBUG oslo_concurrency.lockutils [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.434 187003 DEBUG nova.compute.manager [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:48 compute-0 nova_compute[186999]: 2025-11-24 02:02:48.435 187003 WARNING nova.compute.manager [req-3b70ae99-4f98-4c50-850c-12cfd4ae7a5e req-cf8b224c-2965-41b7-9365-5aaee3312892 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-plugged-cb732a8a-275c-4d2f-8753-b13117c9e15b for instance with vm_state active and task_state None.
Nov 24 02:02:51 compute-0 nova_compute[186999]: 2025-11-24 02:02:51.191 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:51 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:51.903 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.466 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.883 187003 DEBUG nova.compute.manager [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.883 187003 DEBUG nova.compute.manager [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing instance network info cache due to event network-changed-175bb896-4ccd-40b1-8746-160b190ce3fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.884 187003 DEBUG oslo_concurrency.lockutils [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.884 187003 DEBUG oslo_concurrency.lockutils [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.884 187003 DEBUG nova.network.neutron [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Refreshing network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.949 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.950 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.950 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.950 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.950 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.951 187003 INFO nova.compute.manager [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Terminating instance
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.952 187003 DEBUG nova.compute.manager [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:02:52 compute-0 kernel: tap175bb896-4c (unregistering): left promiscuous mode
Nov 24 02:02:52 compute-0 NetworkManager[55458]: <info>  [1763949772.9761] device (tap175bb896-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:02:52 compute-0 ovn_controller[95380]: 2025-11-24T02:02:52Z|00103|binding|INFO|Releasing lport 175bb896-4ccd-40b1-8746-160b190ce3fc from this chassis (sb_readonly=0)
Nov 24 02:02:52 compute-0 ovn_controller[95380]: 2025-11-24T02:02:52Z|00104|binding|INFO|Setting lport 175bb896-4ccd-40b1-8746-160b190ce3fc down in Southbound
Nov 24 02:02:52 compute-0 ovn_controller[95380]: 2025-11-24T02:02:52Z|00105|binding|INFO|Removing iface tap175bb896-4c ovn-installed in OVS
Nov 24 02:02:52 compute-0 nova_compute[186999]: 2025-11-24 02:02:52.986 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:52 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:52.996 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:60:76 10.100.0.11'], port_security=['fa:16:3e:95:60:76 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a2e5c148-2c17-4f54-a3d6-b5655b0e87f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3a228f-1352-43c0-b602-704afca624c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1d665d3-744d-426a-8fc5-4bea51a25946', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97327433-796a-4849-8d2d-30cf53b4e27b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=175bb896-4ccd-40b1-8746-160b190ce3fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:02:52 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:52.999 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 175bb896-4ccd-40b1-8746-160b190ce3fc in datapath cc3a228f-1352-43c0-b602-704afca624c0 unbound from our chassis
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.001 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3a228f-1352-43c0-b602-704afca624c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.002 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7fe595-0cf5-4bdf-b18b-8ad8c0fcdcb3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.003 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0 namespace which is not needed anymore
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.023 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 24 02:02:53 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 17.784s CPU time.
Nov 24 02:02:53 compute-0 systemd-machined[153319]: Machine qemu-6-instance-00000006 terminated.
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [NOTICE]   (215863) : haproxy version is 2.8.14-c23fe91
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [NOTICE]   (215863) : path to executable is /usr/sbin/haproxy
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [WARNING]  (215863) : Exiting Master process...
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [WARNING]  (215863) : Exiting Master process...
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.175 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [ALERT]    (215863) : Current worker (215865) exited with code 143 (Terminated)
Nov 24 02:02:53 compute-0 neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0[215859]: [WARNING]  (215863) : All workers exited. Exiting... (0)
Nov 24 02:02:53 compute-0 systemd[1]: libpod-5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33.scope: Deactivated successfully.
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.180 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 podman[216751]: 2025-11-24 02:02:53.183911139 +0000 UTC m=+0.056342713 container died 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.223 187003 INFO nova.virt.libvirt.driver [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance destroyed successfully.
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.225 187003 DEBUG nova.objects.instance [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:02:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33-userdata-shm.mount: Deactivated successfully.
Nov 24 02:02:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-d61bf364c4657eb6f481207e93ddc77354eaeff1d7df758528e6d43a1dbfcbb4-merged.mount: Deactivated successfully.
Nov 24 02:02:53 compute-0 podman[216751]: 2025-11-24 02:02:53.238948885 +0000 UTC m=+0.111380429 container cleanup 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.238 187003 DEBUG nova.virt.libvirt.vif [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-451741380',display_name='tempest-TestNetworkBasicOps-server-451741380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-451741380',id=6,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIOlvLPNNmxUKBzIpKOyb/g3frWeGInz2JXNMUrVlTX5PVoaMDkcVEEuE0Xc7nJGTXq6CiUNfoM4bMBC/gpgIh7GEelDp4kLP6jF3noekF1csb5EqA7bNS6wBojYQadW8w==',key_name='tempest-TestNetworkBasicOps-397830030',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:01:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ymrgaycl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:01:11Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=a2e5c148-2c17-4f54-a3d6-b5655b0e87f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.240 187003 DEBUG nova.network.os_vif_util [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.241 187003 DEBUG nova.network.os_vif_util [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.241 187003 DEBUG os_vif [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.243 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.243 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap175bb896-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.246 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.248 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:02:53 compute-0 systemd[1]: libpod-conmon-5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33.scope: Deactivated successfully.
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.250 187003 INFO os_vif [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:60:76,bridge_name='br-int',has_traffic_filtering=True,id=175bb896-4ccd-40b1-8746-160b190ce3fc,network=Network(cc3a228f-1352-43c0-b602-704afca624c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap175bb896-4c')
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.251 187003 INFO nova.virt.libvirt.driver [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Deleting instance files /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1_del
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.252 187003 INFO nova.virt.libvirt.driver [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Deletion of /var/lib/nova/instances/a2e5c148-2c17-4f54-a3d6-b5655b0e87f1_del complete
Nov 24 02:02:53 compute-0 podman[216797]: 2025-11-24 02:02:53.312775651 +0000 UTC m=+0.046104124 container remove 5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.318 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[126a2459-aacd-4847-b363-fd245972a909]: (4, ('Mon Nov 24 02:02:53 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0 (5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33)\n5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33\nMon Nov 24 02:02:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0 (5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33)\n5917528b4604e5c6ba2202077bf85bca57d10a2b77cced14414b90764713bc33\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.321 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6776c3d8-c97c-4d44-94e6-11298a852bdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.322 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc3a228f-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.323 187003 INFO nova.compute.manager [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.324 187003 DEBUG oslo.service.loopingcall [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.325 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 kernel: tapcc3a228f-10: left promiscuous mode
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.329 187003 DEBUG nova.compute.manager [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.330 187003 DEBUG nova.network.neutron [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:02:53 compute-0 nova_compute[186999]: 2025-11-24 02:02:53.337 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.341 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4a55f585-d371-4866-970e-05c3a39ba23e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.364 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1116e34e-7a90-4a68-a444-f1682b66f7f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.366 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a40196-9d06-4c72-acdd-dfb292a5b89f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.385 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e988a409-b6f3-482d-ba1f-be5b8261f1f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313732, 'reachable_time': 42705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216812, 'error': None, 'target': 'ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.389 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cc3a228f-1352-43c0-b602-704afca624c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:02:53 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:02:53.390 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9d58a0-8113-4be6-99f6-51ac819db063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:02:53 compute-0 systemd[1]: run-netns-ovnmeta\x2dcc3a228f\x2d1352\x2d43c0\x2db602\x2d704afca624c0.mount: Deactivated successfully.
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.672 187003 DEBUG nova.network.neutron [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.695 187003 INFO nova.compute.manager [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Took 1.36 seconds to deallocate network for instance.
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.736 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.737 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.784 187003 DEBUG nova.compute.provider_tree [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.798 187003 DEBUG nova.scheduler.client.report [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.803 187003 DEBUG nova.network.neutron [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updated VIF entry in instance network info cache for port 175bb896-4ccd-40b1-8746-160b190ce3fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.804 187003 DEBUG nova.network.neutron [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Updating instance_info_cache with network_info: [{"id": "175bb896-4ccd-40b1-8746-160b190ce3fc", "address": "fa:16:3e:95:60:76", "network": {"id": "cc3a228f-1352-43c0-b602-704afca624c0", "bridge": "br-int", "label": "tempest-network-smoke--2043076088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap175bb896-4c", "ovs_interfaceid": "175bb896-4ccd-40b1-8746-160b190ce3fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.821 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.827 187003 DEBUG oslo_concurrency.lockutils [req-58a8a44c-da58-4343-a5e7-790402bd733a req-4963a5d8-0977-44f3-bcd7-e0a1ec2c0a0c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.846 187003 INFO nova.scheduler.client.report [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.897 187003 DEBUG oslo_concurrency.lockutils [None req-3bea243e-0adc-4da4-96ef-27ee95779b75 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.956 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-unplugged-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.957 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.957 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.958 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.958 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-unplugged-175bb896-4ccd-40b1-8746-160b190ce3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.958 187003 WARNING nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-unplugged-175bb896-4ccd-40b1-8746-160b190ce3fc for instance with vm_state deleted and task_state None.
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.959 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.959 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.959 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.959 187003 DEBUG oslo_concurrency.lockutils [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "a2e5c148-2c17-4f54-a3d6-b5655b0e87f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.960 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] No waiting events found dispatching network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.960 187003 WARNING nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received unexpected event network-vif-plugged-175bb896-4ccd-40b1-8746-160b190ce3fc for instance with vm_state deleted and task_state None.
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.960 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Received event network-vif-deleted-175bb896-4ccd-40b1-8746-160b190ce3fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.961 187003 INFO nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Neutron deleted interface 175bb896-4ccd-40b1-8746-160b190ce3fc; detaching it from the instance and deleting it from the info cache
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.961 187003 DEBUG nova.network.neutron [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 24 02:02:54 compute-0 nova_compute[186999]: 2025-11-24 02:02:54.964 187003 DEBUG nova.compute.manager [req-24e8d878-6338-4689-a941-5a68315e72e1 req-a9d2c0bd-e84c-4fb0-b251-3c7b78a0696b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Detach interface failed, port_id=175bb896-4ccd-40b1-8746-160b190ce3fc, reason: Instance a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 24 02:02:55 compute-0 podman[216813]: 2025-11-24 02:02:55.816376492 +0000 UTC m=+0.069854294 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 24 02:02:57 compute-0 nova_compute[186999]: 2025-11-24 02:02:57.468 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:57 compute-0 nova_compute[186999]: 2025-11-24 02:02:57.521 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:57 compute-0 nova_compute[186999]: 2025-11-24 02:02:57.627 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:58 compute-0 nova_compute[186999]: 2025-11-24 02:02:58.247 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:02:59 compute-0 nova_compute[186999]: 2025-11-24 02:02:59.133 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949764.1321867, 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:02:59 compute-0 nova_compute[186999]: 2025-11-24 02:02:59.133 187003 INFO nova.compute.manager [-] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] VM Stopped (Lifecycle Event)
Nov 24 02:02:59 compute-0 nova_compute[186999]: 2025-11-24 02:02:59.149 187003 DEBUG nova.compute.manager [None req-85c9fd91-2a8c-4b05-a736-2d26d4dc7402 - - - - - -] [instance: 55a0d2d6-cb27-4e1e-8f22-5542afa59b1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:00 compute-0 podman[216834]: 2025-11-24 02:03:00.809164908 +0000 UTC m=+0.059759780 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal)
Nov 24 02:03:02 compute-0 sshd-session[216855]: Invalid user pivpn from 46.188.119.26 port 37466
Nov 24 02:03:02 compute-0 sshd-session[216855]: Received disconnect from 46.188.119.26 port 37466:11: Bye Bye [preauth]
Nov 24 02:03:02 compute-0 sshd-session[216855]: Disconnected from invalid user pivpn 46.188.119.26 port 37466 [preauth]
Nov 24 02:03:02 compute-0 nova_compute[186999]: 2025-11-24 02:03:02.470 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:03 compute-0 nova_compute[186999]: 2025-11-24 02:03:03.250 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:06 compute-0 podman[216857]: 2025-11-24 02:03:06.830169885 +0000 UTC m=+0.070397742 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:03:07 compute-0 nova_compute[186999]: 2025-11-24 02:03:07.473 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:08 compute-0 nova_compute[186999]: 2025-11-24 02:03:08.217 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949773.2162642, a2e5c148-2c17-4f54-a3d6-b5655b0e87f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:08 compute-0 nova_compute[186999]: 2025-11-24 02:03:08.218 187003 INFO nova.compute.manager [-] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] VM Stopped (Lifecycle Event)
Nov 24 02:03:08 compute-0 nova_compute[186999]: 2025-11-24 02:03:08.247 187003 DEBUG nova.compute.manager [None req-2b3f6498-1ea1-4e1a-b94e-2431623c7981 - - - - - -] [instance: a2e5c148-2c17-4f54-a3d6-b5655b0e87f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:08 compute-0 nova_compute[186999]: 2025-11-24 02:03:08.252 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:08 compute-0 nova_compute[186999]: 2025-11-24 02:03:08.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:03:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.615 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.615 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.631 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.714 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.715 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.723 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.724 187003 INFO nova.compute.claims [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.815 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:03:11 compute-0 podman[216879]: 2025-11-24 02:03:11.855416488 +0000 UTC m=+0.098790417 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.897 187003 DEBUG nova.compute.provider_tree [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.916 187003 DEBUG nova.scheduler.client.report [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.941 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.942 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.994 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:03:11 compute-0 nova_compute[186999]: 2025-11-24 02:03:11.995 187003 DEBUG nova.network.neutron [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.013 187003 INFO nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.033 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.109 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.110 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.111 187003 INFO nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Creating image(s)
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.111 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.111 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.112 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.127 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.195 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.196 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.197 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.207 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.280 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.281 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.318 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.319 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.320 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.380 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.382 187003 DEBUG nova.virt.disk.api [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.382 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.443 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.444 187003 DEBUG nova.virt.disk.api [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.444 187003 DEBUG nova.objects.instance [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a524177-cd22-46d4-adaf-8c8f552f6edf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.456 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.456 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Ensure instance console log exists: /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.456 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.457 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.457 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.475 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:12 compute-0 nova_compute[186999]: 2025-11-24 02:03:12.804 187003 DEBUG nova.policy [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.255 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:13 compute-0 nova_compute[186999]: 2025-11-24 02:03:13.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.343 187003 DEBUG nova.network.neutron [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Successfully updated port: a39e27b7-ff8f-4834-a397-2a7e27da88db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.356 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.357 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.357 187003 DEBUG nova.network.neutron [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.435 187003 DEBUG nova.compute.manager [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.436 187003 DEBUG nova.compute.manager [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Refreshing instance network info cache due to event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.436 187003 DEBUG oslo_concurrency.lockutils [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.478 187003 DEBUG nova.network.neutron [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.797 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.797 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.798 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.798 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:03:14 compute-0 podman[216913]: 2025-11-24 02:03:14.833028554 +0000 UTC m=+0.087635395 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.970 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.971 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5755MB free_disk=73.45990753173828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.971 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:14 compute-0 nova_compute[186999]: 2025-11-24 02:03:14.971 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.030 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 3a524177-cd22-46d4-adaf-8c8f552f6edf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.031 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.031 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.075 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.085 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.099 187003 DEBUG nova.network.neutron [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updating instance_info_cache with network_info: [{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.100 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.101 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.132 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.133 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Instance network_info: |[{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.133 187003 DEBUG oslo_concurrency.lockutils [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.133 187003 DEBUG nova.network.neutron [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Refreshing network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.136 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Start _get_guest_xml network_info=[{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.140 187003 WARNING nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.144 187003 DEBUG nova.virt.libvirt.host [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.144 187003 DEBUG nova.virt.libvirt.host [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.151 187003 DEBUG nova.virt.libvirt.host [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.152 187003 DEBUG nova.virt.libvirt.host [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.152 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.153 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.153 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.153 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.153 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.154 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.154 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.154 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.154 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.154 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.155 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.155 187003 DEBUG nova.virt.hardware [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.159 187003 DEBUG nova.virt.libvirt.vif [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-49184560',display_name='tempest-TestNetworkBasicOps-server-49184560',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-49184560',id=8,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI1WLvi3AJ85/KXgk1ohOyBjOX/vudaYuGwdPytppCf4sP4KRpRbR+BRoMXeEJ6gyyBfcOqkqrLTYAdD8u5AvZ+mvAyM8awrRCLmgYaOpzljyb52YGjW7hjKUdknhU+Snw==',key_name='tempest-TestNetworkBasicOps-373202742',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-36js87h4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:03:12Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=3a524177-cd22-46d4-adaf-8c8f552f6edf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.159 187003 DEBUG nova.network.os_vif_util [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.160 187003 DEBUG nova.network.os_vif_util [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.161 187003 DEBUG nova.objects.instance [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a524177-cd22-46d4-adaf-8c8f552f6edf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.175 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <uuid>3a524177-cd22-46d4-adaf-8c8f552f6edf</uuid>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <name>instance-00000008</name>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-49184560</nova:name>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:03:15</nova:creationTime>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         <nova:port uuid="a39e27b7-ff8f-4834-a397-2a7e27da88db">
Nov 24 02:03:15 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <system>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="serial">3a524177-cd22-46d4-adaf-8c8f552f6edf</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="uuid">3a524177-cd22-46d4-adaf-8c8f552f6edf</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </system>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <os>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </os>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <features>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </features>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.config"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:e1:03:4b"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <target dev="tapa39e27b7-ff"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/console.log" append="off"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <video>
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </video>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:03:15 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:03:15 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:03:15 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:03:15 compute-0 nova_compute[186999]: </domain>
Nov 24 02:03:15 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.176 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Preparing to wait for external event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.177 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.177 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.177 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.178 187003 DEBUG nova.virt.libvirt.vif [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-49184560',display_name='tempest-TestNetworkBasicOps-server-49184560',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-49184560',id=8,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI1WLvi3AJ85/KXgk1ohOyBjOX/vudaYuGwdPytppCf4sP4KRpRbR+BRoMXeEJ6gyyBfcOqkqrLTYAdD8u5AvZ+mvAyM8awrRCLmgYaOpzljyb52YGjW7hjKUdknhU+Snw==',key_name='tempest-TestNetworkBasicOps-373202742',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-36js87h4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:03:12Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=3a524177-cd22-46d4-adaf-8c8f552f6edf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.178 187003 DEBUG nova.network.os_vif_util [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.179 187003 DEBUG nova.network.os_vif_util [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.179 187003 DEBUG os_vif [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.180 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.180 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.181 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.184 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.185 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa39e27b7-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.185 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa39e27b7-ff, col_values=(('external_ids', {'iface-id': 'a39e27b7-ff8f-4834-a397-2a7e27da88db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:03:4b', 'vm-uuid': '3a524177-cd22-46d4-adaf-8c8f552f6edf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.187 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:15 compute-0 NetworkManager[55458]: <info>  [1763949795.1883] manager: (tapa39e27b7-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.190 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.196 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.197 187003 INFO os_vif [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff')
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.234 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.235 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.235 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:e1:03:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.236 187003 INFO nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Using config drive
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.901 187003 INFO nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Creating config drive at /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.config
Nov 24 02:03:15 compute-0 nova_compute[186999]: 2025-11-24 02:03:15.907 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp40ilh5ri execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.033 187003 DEBUG oslo_concurrency.processutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp40ilh5ri" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:16 compute-0 kernel: tapa39e27b7-ff: entered promiscuous mode
Nov 24 02:03:16 compute-0 ovn_controller[95380]: 2025-11-24T02:03:16Z|00106|binding|INFO|Claiming lport a39e27b7-ff8f-4834-a397-2a7e27da88db for this chassis.
Nov 24 02:03:16 compute-0 ovn_controller[95380]: 2025-11-24T02:03:16Z|00107|binding|INFO|a39e27b7-ff8f-4834-a397-2a7e27da88db: Claiming fa:16:3e:e1:03:4b 10.100.0.12
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.146 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.151 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.160 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:03:4b 10.100.0.12'], port_security=['fa:16:3e:e1:03:4b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3a524177-cd22-46d4-adaf-8c8f552f6edf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-343d4572-e2f0-409b-ab04-cec98c332a12', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f73b816f-fd8d-4071-9e47-7ee8bf6ad1c5, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a39e27b7-ff8f-4834-a397-2a7e27da88db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.162 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a39e27b7-ff8f-4834-a397-2a7e27da88db in datapath 343d4572-e2f0-409b-ab04-cec98c332a12 bound to our chassis
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.163 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.1663] manager: (tapa39e27b7-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.180 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7b66dfac-733c-4495-86a0-6a315660f38d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.181 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap343d4572-e1 in ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.184 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap343d4572-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.184 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ab53a04f-3210-4123-a343-d23ac1b3a6de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.185 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[be3d77dc-a085-43b5-9976-99792418d009]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.200 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[da83e128-db58-4fd6-ad80-ea68acfb1286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.209 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 ovn_controller[95380]: 2025-11-24T02:03:16Z|00108|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db ovn-installed in OVS
Nov 24 02:03:16 compute-0 ovn_controller[95380]: 2025-11-24T02:03:16Z|00109|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db up in Southbound
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.216 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 systemd-machined[153319]: New machine qemu-8-instance-00000008.
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.218 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[487aff55-e1f8-489e-8ec3-b0d5cfc9733d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 systemd-udevd[216992]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:03:16 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 24 02:03:16 compute-0 podman[216946]: 2025-11-24 02:03:16.241570605 +0000 UTC m=+0.099310282 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.2444] device (tapa39e27b7-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.2455] device (tapa39e27b7-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.256 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e3434d06-39ab-401b-b402-6d9d728b8908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.2640] manager: (tap343d4572-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.263 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7157a2be-fafc-41c5-b487-ccbefb8a7407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 podman[216947]: 2025-11-24 02:03:16.265555437 +0000 UTC m=+0.124356453 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.296 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[3222a0e7-b241-4146-a2a9-0592dad69c9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.300 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[607d77eb-9146-4f04-9616-750d4ea5d198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.3264] device (tap343d4572-e0): carrier: link connected
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.331 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[85a460b2-ad33-4963-819d-5fa3904c2329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.350 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc02935-f7b1-4a3f-bd5f-edb7ee774925]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap343d4572-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:5f:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326299, 'reachable_time': 27498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217034, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.368 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[dde8f4d1-4509-4152-81d7-c08624de7782]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:5f1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326299, 'tstamp': 326299}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217035, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.390 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[5786a5c4-b74f-4f06-a20b-42e4aef058a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap343d4572-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:5f:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326299, 'reachable_time': 27498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217036, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.428 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c1713f-fdbc-459f-b741-2fbbedf79b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.504 187003 DEBUG nova.compute.manager [req-0d702dc3-34d7-4faa-bff5-970d472c15a2 req-006c4239-0fcb-416e-a912-e7ad52f99a10 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.505 187003 DEBUG oslo_concurrency.lockutils [req-0d702dc3-34d7-4faa-bff5-970d472c15a2 req-006c4239-0fcb-416e-a912-e7ad52f99a10 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.505 187003 DEBUG oslo_concurrency.lockutils [req-0d702dc3-34d7-4faa-bff5-970d472c15a2 req-006c4239-0fcb-416e-a912-e7ad52f99a10 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.505 187003 DEBUG oslo_concurrency.lockutils [req-0d702dc3-34d7-4faa-bff5-970d472c15a2 req-006c4239-0fcb-416e-a912-e7ad52f99a10 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.505 187003 DEBUG nova.compute.manager [req-0d702dc3-34d7-4faa-bff5-970d472c15a2 req-006c4239-0fcb-416e-a912-e7ad52f99a10 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Processing event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.509 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6c60a6f8-94a1-45b1-8630-828380a6a0ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.511 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap343d4572-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.512 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.512 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap343d4572-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:16 compute-0 NetworkManager[55458]: <info>  [1763949796.5152] manager: (tap343d4572-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 24 02:03:16 compute-0 kernel: tap343d4572-e0: entered promiscuous mode
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.514 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.518 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap343d4572-e0, col_values=(('external_ids', {'iface-id': '028f6b13-a924-41b7-9c48-ff7f99586f03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.519 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 ovn_controller[95380]: 2025-11-24T02:03:16Z|00110|binding|INFO|Releasing lport 028f6b13-a924-41b7-9c48-ff7f99586f03 from this chassis (sb_readonly=0)
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.532 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.533 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.533 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.534 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec8540b-e3b0-41fc-9026-ec197d634c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.535 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:03:16 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:16.536 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'env', 'PROCESS_TAG=haproxy-343d4572-e2f0-409b-ab04-cec98c332a12', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/343d4572-e2f0-409b-ab04-cec98c332a12.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.729 187003 DEBUG nova.network.neutron [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updated VIF entry in instance network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.730 187003 DEBUG nova.network.neutron [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updating instance_info_cache with network_info: [{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:16 compute-0 nova_compute[186999]: 2025-11-24 02:03:16.745 187003 DEBUG oslo_concurrency.lockutils [req-5ab23031-b88a-4fc1-81da-3c463337d24e req-a39e2cd2-644c-4b29-9635-63affc80bed9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:03:16 compute-0 podman[217067]: 2025-11-24 02:03:16.957209423 +0000 UTC m=+0.050545206 container create 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:03:17 compute-0 systemd[1]: Started libpod-conmon-0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7.scope.
Nov 24 02:03:17 compute-0 podman[217067]: 2025-11-24 02:03:16.931411721 +0000 UTC m=+0.024747534 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:03:17 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:03:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb8379444cd030c35d4cd25eb8035e2b53cfd28122ae451bcd1e6f3b852fe2dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:03:17 compute-0 podman[217067]: 2025-11-24 02:03:17.051239376 +0000 UTC m=+0.144575189 container init 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 24 02:03:17 compute-0 podman[217067]: 2025-11-24 02:03:17.057116301 +0000 UTC m=+0.150452094 container start 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 02:03:17 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [NOTICE]   (217086) : New worker (217088) forked
Nov 24 02:03:17 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [NOTICE]   (217086) : Loading success.
Nov 24 02:03:17 compute-0 nova_compute[186999]: 2025-11-24 02:03:17.520 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.084 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949798.0832977, 3a524177-cd22-46d4-adaf-8c8f552f6edf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.084 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] VM Started (Lifecycle Event)
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.087 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.091 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.095 187003 INFO nova.virt.libvirt.driver [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Instance spawned successfully.
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.096 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.099 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.101 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.101 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.109 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.116 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.117 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.118 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.118 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.119 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.119 187003 DEBUG nova.virt.libvirt.driver [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.126 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.127 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949798.0846562, 3a524177-cd22-46d4-adaf-8c8f552f6edf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.127 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] VM Paused (Lifecycle Event)
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.144 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.149 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949798.08991, 3a524177-cd22-46d4-adaf-8c8f552f6edf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.150 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] VM Resumed (Lifecycle Event)
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.169 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.173 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.176 187003 INFO nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Took 6.07 seconds to spawn the instance on the hypervisor.
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.176 187003 DEBUG nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.201 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.226 187003 INFO nova.compute.manager [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Took 6.55 seconds to build instance.
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.238 187003 DEBUG oslo_concurrency.lockutils [None req-4de775cb-ca75-491a-95ee-db06a44392f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.581 187003 DEBUG nova.compute.manager [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.581 187003 DEBUG oslo_concurrency.lockutils [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.582 187003 DEBUG oslo_concurrency.lockutils [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.582 187003 DEBUG oslo_concurrency.lockutils [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.582 187003 DEBUG nova.compute.manager [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] No waiting events found dispatching network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:18 compute-0 nova_compute[186999]: 2025-11-24 02:03:18.582 187003 WARNING nova.compute.manager [req-1080a52a-63e1-4be0-a12e-17d806362005 req-3a6625b5-45d7-4ffe-a97d-c4705c2ea440 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received unexpected event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with vm_state active and task_state None.
Nov 24 02:03:20 compute-0 nova_compute[186999]: 2025-11-24 02:03:20.187 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:22 compute-0 nova_compute[186999]: 2025-11-24 02:03:22.539 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.077 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 NetworkManager[55458]: <info>  [1763949803.0787] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 24 02:03:23 compute-0 NetworkManager[55458]: <info>  [1763949803.0802] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 24 02:03:23 compute-0 ovn_controller[95380]: 2025-11-24T02:03:23Z|00111|binding|INFO|Releasing lport 028f6b13-a924-41b7-9c48-ff7f99586f03 from this chassis (sb_readonly=0)
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.103 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 ovn_controller[95380]: 2025-11-24T02:03:23Z|00112|binding|INFO|Releasing lport 028f6b13-a924-41b7-9c48-ff7f99586f03 from this chassis (sb_readonly=0)
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.111 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.283 187003 DEBUG nova.compute.manager [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.283 187003 DEBUG nova.compute.manager [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Refreshing instance network info cache due to event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.283 187003 DEBUG oslo_concurrency.lockutils [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.284 187003 DEBUG oslo_concurrency.lockutils [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.284 187003 DEBUG nova.network.neutron [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Refreshing network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.442 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.442 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.443 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.443 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.443 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.444 187003 INFO nova.compute.manager [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Terminating instance
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.446 187003 DEBUG nova.compute.manager [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:03:23 compute-0 kernel: tapa39e27b7-ff (unregistering): left promiscuous mode
Nov 24 02:03:23 compute-0 NetworkManager[55458]: <info>  [1763949803.4837] device (tapa39e27b7-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:03:23 compute-0 ovn_controller[95380]: 2025-11-24T02:03:23Z|00113|binding|INFO|Releasing lport a39e27b7-ff8f-4834-a397-2a7e27da88db from this chassis (sb_readonly=0)
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.495 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 ovn_controller[95380]: 2025-11-24T02:03:23Z|00114|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db down in Southbound
Nov 24 02:03:23 compute-0 ovn_controller[95380]: 2025-11-24T02:03:23Z|00115|binding|INFO|Removing iface tapa39e27b7-ff ovn-installed in OVS
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.505 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:03:4b 10.100.0.12'], port_security=['fa:16:3e:e1:03:4b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3a524177-cd22-46d4-adaf-8c8f552f6edf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-343d4572-e2f0-409b-ab04-cec98c332a12', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f73b816f-fd8d-4071-9e47-7ee8bf6ad1c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a39e27b7-ff8f-4834-a397-2a7e27da88db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.507 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a39e27b7-ff8f-4834-a397-2a7e27da88db in datapath 343d4572-e2f0-409b-ab04-cec98c332a12 unbound from our chassis
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.508 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 343d4572-e2f0-409b-ab04-cec98c332a12, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.510 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b4795857-e096-4cea-b6e0-b0eb840b44c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.510 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 namespace which is not needed anymore
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.515 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 24 02:03:23 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 7.336s CPU time.
Nov 24 02:03:23 compute-0 systemd-machined[153319]: Machine qemu-8-instance-00000008 terminated.
Nov 24 02:03:23 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [NOTICE]   (217086) : haproxy version is 2.8.14-c23fe91
Nov 24 02:03:23 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [NOTICE]   (217086) : path to executable is /usr/sbin/haproxy
Nov 24 02:03:23 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [ALERT]    (217086) : Current worker (217088) exited with code 143 (Terminated)
Nov 24 02:03:23 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217082]: [WARNING]  (217086) : All workers exited. Exiting... (0)
Nov 24 02:03:23 compute-0 systemd[1]: libpod-0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7.scope: Deactivated successfully.
Nov 24 02:03:23 compute-0 podman[217128]: 2025-11-24 02:03:23.662064176 +0000 UTC m=+0.044333052 container died 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 02:03:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7-userdata-shm.mount: Deactivated successfully.
Nov 24 02:03:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-eb8379444cd030c35d4cd25eb8035e2b53cfd28122ae451bcd1e6f3b852fe2dc-merged.mount: Deactivated successfully.
Nov 24 02:03:23 compute-0 podman[217128]: 2025-11-24 02:03:23.709986408 +0000 UTC m=+0.092255284 container cleanup 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 02:03:23 compute-0 systemd[1]: libpod-conmon-0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7.scope: Deactivated successfully.
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.726 187003 INFO nova.virt.libvirt.driver [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Instance destroyed successfully.
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.727 187003 DEBUG nova.objects.instance [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 3a524177-cd22-46d4-adaf-8c8f552f6edf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.737 187003 DEBUG nova.virt.libvirt.vif [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-49184560',display_name='tempest-TestNetworkBasicOps-server-49184560',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-49184560',id=8,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI1WLvi3AJ85/KXgk1ohOyBjOX/vudaYuGwdPytppCf4sP4KRpRbR+BRoMXeEJ6gyyBfcOqkqrLTYAdD8u5AvZ+mvAyM8awrRCLmgYaOpzljyb52YGjW7hjKUdknhU+Snw==',key_name='tempest-TestNetworkBasicOps-373202742',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:03:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-36js87h4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:03:18Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=3a524177-cd22-46d4-adaf-8c8f552f6edf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.738 187003 DEBUG nova.network.os_vif_util [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.738 187003 DEBUG nova.network.os_vif_util [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.739 187003 DEBUG os_vif [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.740 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.741 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa39e27b7-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.743 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.745 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.749 187003 INFO os_vif [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff')
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.750 187003 INFO nova.virt.libvirt.driver [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Deleting instance files /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf_del
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.750 187003 INFO nova.virt.libvirt.driver [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Deletion of /var/lib/nova/instances/3a524177-cd22-46d4-adaf-8c8f552f6edf_del complete
Nov 24 02:03:23 compute-0 podman[217174]: 2025-11-24 02:03:23.779682119 +0000 UTC m=+0.042609434 container remove 0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.785 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0885f407-57f1-4647-aee1-d0064be1aeb7]: (4, ('Mon Nov 24 02:03:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 (0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7)\n0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7\nMon Nov 24 02:03:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 (0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7)\n0839eb05f36549063d9ee77864dac37cb543575b7d061f1883968d92b85682e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.787 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[124853bc-3880-43fa-900f-b4f7b4b24a4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.788 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap343d4572-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:23 compute-0 kernel: tap343d4572-e0: left promiscuous mode
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.790 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.799 187003 INFO nova.compute.manager [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.799 187003 DEBUG oslo.service.loopingcall [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.799 187003 DEBUG nova.compute.manager [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.800 187003 DEBUG nova.network.neutron [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:03:23 compute-0 nova_compute[186999]: 2025-11-24 02:03:23.803 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.804 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4037d780-1b53-4bd9-8fa8-4c7182324309]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.822 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6b4522-3cbd-4ec6-900a-3b8ad939432f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.823 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6acec543-8391-456a-b0c3-bbb5fcb58df2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.841 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0a12f675-6f31-4db1-93c4-6ead71fe1aba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326291, 'reachable_time': 31240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217189, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.845 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:03:23 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:23.846 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[2467408e-f0ce-486f-82b6-697f49a63c0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:23 compute-0 systemd[1]: run-netns-ovnmeta\x2d343d4572\x2de2f0\x2d409b\x2dab04\x2dcec98c332a12.mount: Deactivated successfully.
Nov 24 02:03:24 compute-0 nova_compute[186999]: 2025-11-24 02:03:24.528 187003 DEBUG nova.network.neutron [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updated VIF entry in instance network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:03:24 compute-0 nova_compute[186999]: 2025-11-24 02:03:24.529 187003 DEBUG nova.network.neutron [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updating instance_info_cache with network_info: [{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:24 compute-0 nova_compute[186999]: 2025-11-24 02:03:24.547 187003 DEBUG oslo_concurrency.lockutils [req-d533844d-ebfb-403c-8c13-15fdd609838e req-91828b64-586a-490c-b071-29956335683c 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-3a524177-cd22-46d4-adaf-8c8f552f6edf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.199 187003 DEBUG nova.network.neutron [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.216 187003 INFO nova.compute.manager [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Took 1.42 seconds to deallocate network for instance.
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.262 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.263 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.340 187003 DEBUG nova.compute.provider_tree [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.354 187003 DEBUG nova.scheduler.client.report [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.369 187003 DEBUG nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.369 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.370 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.371 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.371 187003 DEBUG nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] No waiting events found dispatching network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.372 187003 WARNING nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received unexpected event network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with vm_state deleted and task_state None.
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.372 187003 DEBUG nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.373 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.373 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.373 187003 DEBUG oslo_concurrency.lockutils [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.374 187003 DEBUG nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] No waiting events found dispatching network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.374 187003 WARNING nova.compute.manager [req-cf71dae7-66ff-4eaf-aadf-f5faadc77eb7 req-5606276e-a964-49b2-b8ff-d32fa3a8af0a 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Received unexpected event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with vm_state deleted and task_state None.
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.377 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.404 187003 INFO nova.scheduler.client.report [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 3a524177-cd22-46d4-adaf-8c8f552f6edf
Nov 24 02:03:25 compute-0 nova_compute[186999]: 2025-11-24 02:03:25.485 187003 DEBUG oslo_concurrency.lockutils [None req-64893cbb-6430-4494-9242-a2f816729dca e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "3a524177-cd22-46d4-adaf-8c8f552f6edf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:26 compute-0 podman[217190]: 2025-11-24 02:03:26.833048558 +0000 UTC m=+0.074046034 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:03:27 compute-0 nova_compute[186999]: 2025-11-24 02:03:27.541 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:28 compute-0 nova_compute[186999]: 2025-11-24 02:03:28.743 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:31 compute-0 podman[217210]: 2025-11-24 02:03:31.84399228 +0000 UTC m=+0.094012804 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 24 02:03:32 compute-0 nova_compute[186999]: 2025-11-24 02:03:32.542 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:33 compute-0 nova_compute[186999]: 2025-11-24 02:03:33.747 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.771 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.771 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.787 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.866 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.867 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.876 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:03:34 compute-0 nova_compute[186999]: 2025-11-24 02:03:34.876 187003 INFO nova.compute.claims [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.004 187003 DEBUG nova.compute.provider_tree [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.017 187003 DEBUG nova.scheduler.client.report [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.035 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.036 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.081 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.082 187003 DEBUG nova.network.neutron [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.097 187003 INFO nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.113 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.219 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.221 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.222 187003 INFO nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Creating image(s)
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.222 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.223 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.224 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.240 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.314 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.315 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.316 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.330 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.413 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.415 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.455 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.456 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.457 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.516 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.518 187003 DEBUG nova.virt.disk.api [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.519 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.581 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.582 187003 DEBUG nova.virt.disk.api [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.582 187003 DEBUG nova.objects.instance [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.593 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.593 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Ensure instance console log exists: /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.594 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.594 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.594 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:35 compute-0 nova_compute[186999]: 2025-11-24 02:03:35.758 187003 DEBUG nova.policy [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.610 187003 DEBUG nova.network.neutron [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Successfully updated port: a39e27b7-ff8f-4834-a397-2a7e27da88db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.628 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.629 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.629 187003 DEBUG nova.network.neutron [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.700 187003 DEBUG nova.compute.manager [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.701 187003 DEBUG nova.compute.manager [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Refreshing instance network info cache due to event network-changed-a39e27b7-ff8f-4834-a397-2a7e27da88db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.701 187003 DEBUG oslo_concurrency.lockutils [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:03:36 compute-0 nova_compute[186999]: 2025-11-24 02:03:36.801 187003 DEBUG nova.network.neutron [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.545 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.781 187003 DEBUG nova.network.neutron [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Updating instance_info_cache with network_info: [{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.800 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.801 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Instance network_info: |[{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.801 187003 DEBUG oslo_concurrency.lockutils [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.801 187003 DEBUG nova.network.neutron [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Refreshing network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.805 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Start _get_guest_xml network_info=[{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.818 187003 WARNING nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:03:37 compute-0 podman[217247]: 2025-11-24 02:03:37.822991178 +0000 UTC m=+0.070816764 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.826 187003 DEBUG nova.virt.libvirt.host [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.827 187003 DEBUG nova.virt.libvirt.host [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.830 187003 DEBUG nova.virt.libvirt.host [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.831 187003 DEBUG nova.virt.libvirt.host [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.831 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.832 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.832 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.833 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.833 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.833 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.833 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.834 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.834 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.834 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.835 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.835 187003 DEBUG nova.virt.hardware [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.840 187003 DEBUG nova.virt.libvirt.vif [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:03:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1565072121',display_name='tempest-TestNetworkBasicOps-server-1565072121',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1565072121',id=9,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIecYNXXzdra3Hp+3H0wXlNhREjEQhN3oVwZULmujJNKTM3eDfkMuVQGp0zVkHPRf0n7GeU4y9YuJ2u2FDO0pmLfVKKb6/RpO2jYcsmk1OcE6B7oHXX0jzFeu98ATvUOuQ==',key_name='tempest-TestNetworkBasicOps-934107515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-p6iym5ua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:03:35Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=2f6f3f4f-82fd-4f26-96ef-89afb6dc811e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.840 187003 DEBUG nova.network.os_vif_util [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.841 187003 DEBUG nova.network.os_vif_util [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.843 187003 DEBUG nova.objects.instance [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.857 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <uuid>2f6f3f4f-82fd-4f26-96ef-89afb6dc811e</uuid>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <name>instance-00000009</name>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-1565072121</nova:name>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:03:37</nova:creationTime>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         <nova:port uuid="a39e27b7-ff8f-4834-a397-2a7e27da88db">
Nov 24 02:03:37 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <system>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="serial">2f6f3f4f-82fd-4f26-96ef-89afb6dc811e</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="uuid">2f6f3f4f-82fd-4f26-96ef-89afb6dc811e</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </system>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <os>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </os>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <features>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </features>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.config"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:e1:03:4b"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <target dev="tapa39e27b7-ff"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/console.log" append="off"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <video>
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </video>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:03:37 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:03:37 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:03:37 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:03:37 compute-0 nova_compute[186999]: </domain>
Nov 24 02:03:37 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.858 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Preparing to wait for external event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.858 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.859 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.859 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.860 187003 DEBUG nova.virt.libvirt.vif [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:03:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1565072121',display_name='tempest-TestNetworkBasicOps-server-1565072121',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1565072121',id=9,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIecYNXXzdra3Hp+3H0wXlNhREjEQhN3oVwZULmujJNKTM3eDfkMuVQGp0zVkHPRf0n7GeU4y9YuJ2u2FDO0pmLfVKKb6/RpO2jYcsmk1OcE6B7oHXX0jzFeu98ATvUOuQ==',key_name='tempest-TestNetworkBasicOps-934107515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-p6iym5ua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:03:35Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=2f6f3f4f-82fd-4f26-96ef-89afb6dc811e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.860 187003 DEBUG nova.network.os_vif_util [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.861 187003 DEBUG nova.network.os_vif_util [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.861 187003 DEBUG os_vif [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.861 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.862 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.862 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.866 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.866 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa39e27b7-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.866 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa39e27b7-ff, col_values=(('external_ids', {'iface-id': 'a39e27b7-ff8f-4834-a397-2a7e27da88db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:03:4b', 'vm-uuid': '2f6f3f4f-82fd-4f26-96ef-89afb6dc811e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.868 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:37 compute-0 NetworkManager[55458]: <info>  [1763949817.8706] manager: (tapa39e27b7-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.870 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.876 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.877 187003 INFO os_vif [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff')
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.944 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.944 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.945 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:e1:03:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:03:37 compute-0 nova_compute[186999]: 2025-11-24 02:03:37.945 187003 INFO nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Using config drive
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.670 187003 INFO nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Creating config drive at /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.config
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.679 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp13wfftae execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.724 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949803.723304, 3a524177-cd22-46d4-adaf-8c8f552f6edf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.725 187003 INFO nova.compute.manager [-] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] VM Stopped (Lifecycle Event)
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.749 187003 DEBUG nova.compute.manager [None req-cabf8a22-361d-4dd4-87fb-d211fa3624e5 - - - - - -] [instance: 3a524177-cd22-46d4-adaf-8c8f552f6edf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.811 187003 DEBUG oslo_concurrency.processutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp13wfftae" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:03:38 compute-0 kernel: tapa39e27b7-ff: entered promiscuous mode
Nov 24 02:03:38 compute-0 NetworkManager[55458]: <info>  [1763949818.8811] manager: (tapa39e27b7-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 24 02:03:38 compute-0 ovn_controller[95380]: 2025-11-24T02:03:38Z|00116|binding|INFO|Claiming lport a39e27b7-ff8f-4834-a397-2a7e27da88db for this chassis.
Nov 24 02:03:38 compute-0 ovn_controller[95380]: 2025-11-24T02:03:38Z|00117|binding|INFO|a39e27b7-ff8f-4834-a397-2a7e27da88db: Claiming fa:16:3e:e1:03:4b 10.100.0.12
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.883 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.893 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:03:4b 10.100.0.12'], port_security=['fa:16:3e:e1:03:4b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2f6f3f4f-82fd-4f26-96ef-89afb6dc811e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-343d4572-e2f0-409b-ab04-cec98c332a12', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '7', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f73b816f-fd8d-4071-9e47-7ee8bf6ad1c5, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a39e27b7-ff8f-4834-a397-2a7e27da88db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.897 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a39e27b7-ff8f-4834-a397-2a7e27da88db in datapath 343d4572-e2f0-409b-ab04-cec98c332a12 bound to our chassis
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.899 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:38 compute-0 ovn_controller[95380]: 2025-11-24T02:03:38Z|00118|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db ovn-installed in OVS
Nov 24 02:03:38 compute-0 ovn_controller[95380]: 2025-11-24T02:03:38Z|00119|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db up in Southbound
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.904 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:38 compute-0 nova_compute[186999]: 2025-11-24 02:03:38.912 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.916 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0c631491-1f62-422a-ad25-a25ca15a40c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.917 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap343d4572-e1 in ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.920 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap343d4572-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.921 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f50cd94f-9b5f-454a-b7cf-5978f1dd3348]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.922 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[de7c5f39-4a79-47bd-beb9-73c7e4912520]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.936 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[467c9534-9973-4868-bf22-e469db76f739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:38 compute-0 systemd-machined[153319]: New machine qemu-9-instance-00000009.
Nov 24 02:03:38 compute-0 systemd-udevd[217293]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:03:38 compute-0 NetworkManager[55458]: <info>  [1763949818.9587] device (tapa39e27b7-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:03:38 compute-0 NetworkManager[55458]: <info>  [1763949818.9599] device (tapa39e27b7-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:03:38 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Nov 24 02:03:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:38.963 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[59609dd5-5ec4-49e0-9676-ca17f9998709]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.004 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[be6157bd-a981-4480-bf77-cf6dd53a63f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.011 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1328197d-681c-41f0-9afa-4fc241c3e91f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 NetworkManager[55458]: <info>  [1763949819.0128] manager: (tap343d4572-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.056 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[649e0be4-edbc-48a8-80c5-0425db2eacf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.061 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[3c86fb01-6c00-4840-955c-59b2b1e84a17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 NetworkManager[55458]: <info>  [1763949819.0896] device (tap343d4572-e0): carrier: link connected
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.090 187003 DEBUG nova.compute.manager [req-943b59de-8a8e-4083-a50b-458275eee557 req-bc93cb28-e9dd-4b88-ad61-1da7f0d3c112 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.091 187003 DEBUG oslo_concurrency.lockutils [req-943b59de-8a8e-4083-a50b-458275eee557 req-bc93cb28-e9dd-4b88-ad61-1da7f0d3c112 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.092 187003 DEBUG oslo_concurrency.lockutils [req-943b59de-8a8e-4083-a50b-458275eee557 req-bc93cb28-e9dd-4b88-ad61-1da7f0d3c112 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.092 187003 DEBUG oslo_concurrency.lockutils [req-943b59de-8a8e-4083-a50b-458275eee557 req-bc93cb28-e9dd-4b88-ad61-1da7f0d3c112 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.092 187003 DEBUG nova.compute.manager [req-943b59de-8a8e-4083-a50b-458275eee557 req-bc93cb28-e9dd-4b88-ad61-1da7f0d3c112 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Processing event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.098 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f885bc-c964-4897-b517-22f192a7eecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.115 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ed7851-b15f-4ea2-825a-3837512482c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap343d4572-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:5f:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328575, 'reachable_time': 43301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217324, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.129 187003 DEBUG nova.network.neutron [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Updated VIF entry in instance network info cache for port a39e27b7-ff8f-4834-a397-2a7e27da88db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.130 187003 DEBUG nova.network.neutron [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Updating instance_info_cache with network_info: [{"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.133 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[6b370d19-706e-45e0-b2ac-7c7c5b22bd0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:5f1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 328575, 'tstamp': 328575}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217325, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.144 187003 DEBUG oslo_concurrency.lockutils [req-c8233dd4-51d0-40fa-9f55-094615b81841 req-463293a2-0017-4f6d-a84a-3e95dc0b5f32 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.152 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e462a6e1-bc63-4a7d-be35-7dca24d85966]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap343d4572-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:5f:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328575, 'reachable_time': 43301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217326, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.189 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a86e6ec3-6fa3-495f-85a1-adeb8aa40da7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.283 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9536538b-663a-49c7-8fa8-454836066969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.285 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap343d4572-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.285 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.286 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap343d4572-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.287 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:39 compute-0 kernel: tap343d4572-e0: entered promiscuous mode
Nov 24 02:03:39 compute-0 NetworkManager[55458]: <info>  [1763949819.2892] manager: (tap343d4572-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.295 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap343d4572-e0, col_values=(('external_ids', {'iface-id': '028f6b13-a924-41b7-9c48-ff7f99586f03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.296 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:39 compute-0 ovn_controller[95380]: 2025-11-24T02:03:39Z|00120|binding|INFO|Releasing lport 028f6b13-a924-41b7-9c48-ff7f99586f03 from this chassis (sb_readonly=0)
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.297 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.297 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.302 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[40b97274-df28-4cf0-b8af-0b84797e479c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.303 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/343d4572-e2f0-409b-ab04-cec98c332a12.pid.haproxy
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 343d4572-e2f0-409b-ab04-cec98c332a12
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:03:39 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:39.304 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'env', 'PROCESS_TAG=haproxy-343d4572-e2f0-409b-ab04-cec98c332a12', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/343d4572-e2f0-409b-ab04-cec98c332a12.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.307 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.335 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.336 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949819.3350587, 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.336 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] VM Started (Lifecycle Event)
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.341 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.345 187003 INFO nova.virt.libvirt.driver [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Instance spawned successfully.
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.346 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.371 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.377 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.381 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.382 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.383 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.383 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.383 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.384 187003 DEBUG nova.virt.libvirt.driver [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.422 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.423 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949819.3387315, 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.423 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] VM Paused (Lifecycle Event)
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.442 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.449 187003 INFO nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Took 4.23 seconds to spawn the instance on the hypervisor.
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.450 187003 DEBUG nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.455 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949819.3403177, 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.455 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] VM Resumed (Lifecycle Event)
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.479 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.482 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.502 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.517 187003 INFO nova.compute.manager [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Took 4.69 seconds to build instance.
Nov 24 02:03:39 compute-0 nova_compute[186999]: 2025-11-24 02:03:39.532 187003 DEBUG oslo_concurrency.lockutils [None req-d3ae552c-efce-4b73-bcdb-8bdcc7d69583 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:39 compute-0 podman[217362]: 2025-11-24 02:03:39.724965846 +0000 UTC m=+0.061024180 container create e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:03:39 compute-0 systemd[1]: Started libpod-conmon-e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4.scope.
Nov 24 02:03:39 compute-0 podman[217362]: 2025-11-24 02:03:39.691780137 +0000 UTC m=+0.027838491 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:03:39 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:03:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bf870f1d0e21199c087c83381260931be31baadc97c1a831257698155e55ed4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:03:39 compute-0 podman[217362]: 2025-11-24 02:03:39.823013351 +0000 UTC m=+0.159071695 container init e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 02:03:39 compute-0 podman[217362]: 2025-11-24 02:03:39.828504195 +0000 UTC m=+0.164562539 container start e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 02:03:39 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [NOTICE]   (217381) : New worker (217383) forked
Nov 24 02:03:39 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [NOTICE]   (217381) : Loading success.
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.183 187003 DEBUG nova.compute.manager [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.185 187003 DEBUG oslo_concurrency.lockutils [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.185 187003 DEBUG oslo_concurrency.lockutils [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.186 187003 DEBUG oslo_concurrency.lockutils [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.186 187003 DEBUG nova.compute.manager [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] No waiting events found dispatching network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:41 compute-0 nova_compute[186999]: 2025-11-24 02:03:41.187 187003 WARNING nova.compute.manager [req-07c30a5e-90c1-4c28-ba56-f5b3a8e5ccbc req-c34abfbe-e6b2-437f-b80a-d81ae09771e6 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received unexpected event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with vm_state active and task_state None.
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.036 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.037 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.038 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.038 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.039 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.041 187003 INFO nova.compute.manager [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Terminating instance
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.043 187003 DEBUG nova.compute.manager [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:03:42 compute-0 kernel: tapa39e27b7-ff (unregistering): left promiscuous mode
Nov 24 02:03:42 compute-0 NetworkManager[55458]: <info>  [1763949822.0719] device (tapa39e27b7-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:03:42 compute-0 ovn_controller[95380]: 2025-11-24T02:03:42Z|00121|binding|INFO|Releasing lport a39e27b7-ff8f-4834-a397-2a7e27da88db from this chassis (sb_readonly=0)
Nov 24 02:03:42 compute-0 ovn_controller[95380]: 2025-11-24T02:03:42Z|00122|binding|INFO|Setting lport a39e27b7-ff8f-4834-a397-2a7e27da88db down in Southbound
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.083 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 ovn_controller[95380]: 2025-11-24T02:03:42Z|00123|binding|INFO|Removing iface tapa39e27b7-ff ovn-installed in OVS
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.090 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.094 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:03:4b 10.100.0.12'], port_security=['fa:16:3e:e1:03:4b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2f6f3f4f-82fd-4f26-96ef-89afb6dc811e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-343d4572-e2f0-409b-ab04-cec98c332a12', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-117439907', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '9', 'neutron:security_group_ids': '024c6ae6-4219-4646-a879-cfde045956dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.188', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f73b816f-fd8d-4071-9e47-7ee8bf6ad1c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=a39e27b7-ff8f-4834-a397-2a7e27da88db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.097 104238 INFO neutron.agent.ovn.metadata.agent [-] Port a39e27b7-ff8f-4834-a397-2a7e27da88db in datapath 343d4572-e2f0-409b-ab04-cec98c332a12 unbound from our chassis
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.098 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 343d4572-e2f0-409b-ab04-cec98c332a12, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.099 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[44656d56-97e5-4d82-b842-acfcf9a662a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.100 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 namespace which is not needed anymore
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.110 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 24 02:03:42 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.884s CPU time.
Nov 24 02:03:42 compute-0 systemd-machined[153319]: Machine qemu-9-instance-00000009 terminated.
Nov 24 02:03:42 compute-0 podman[217392]: 2025-11-24 02:03:42.199126095 +0000 UTC m=+0.086678328 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [NOTICE]   (217381) : haproxy version is 2.8.14-c23fe91
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [NOTICE]   (217381) : path to executable is /usr/sbin/haproxy
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [WARNING]  (217381) : Exiting Master process...
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [WARNING]  (217381) : Exiting Master process...
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [ALERT]    (217381) : Current worker (217383) exited with code 143 (Terminated)
Nov 24 02:03:42 compute-0 neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12[217377]: [WARNING]  (217381) : All workers exited. Exiting... (0)
Nov 24 02:03:42 compute-0 systemd[1]: libpod-e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4.scope: Deactivated successfully.
Nov 24 02:03:42 compute-0 conmon[217377]: conmon e134b962f18b9b4418ac <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4.scope/container/memory.events
Nov 24 02:03:42 compute-0 podman[217435]: 2025-11-24 02:03:42.25681481 +0000 UTC m=+0.046820852 container died e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.267 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.275 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4-userdata-shm.mount: Deactivated successfully.
Nov 24 02:03:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-4bf870f1d0e21199c087c83381260931be31baadc97c1a831257698155e55ed4-merged.mount: Deactivated successfully.
Nov 24 02:03:42 compute-0 podman[217435]: 2025-11-24 02:03:42.309572947 +0000 UTC m=+0.099578989 container cleanup e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.313 187003 INFO nova.virt.libvirt.driver [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Instance destroyed successfully.
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.313 187003 DEBUG nova.objects.instance [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:03:42 compute-0 systemd[1]: libpod-conmon-e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4.scope: Deactivated successfully.
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.323 187003 DEBUG nova.virt.libvirt.vif [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:03:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1565072121',display_name='tempest-TestNetworkBasicOps-server-1565072121',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1565072121',id=9,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIecYNXXzdra3Hp+3H0wXlNhREjEQhN3oVwZULmujJNKTM3eDfkMuVQGp0zVkHPRf0n7GeU4y9YuJ2u2FDO0pmLfVKKb6/RpO2jYcsmk1OcE6B7oHXX0jzFeu98ATvUOuQ==',key_name='tempest-TestNetworkBasicOps-934107515',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:03:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-p6iym5ua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:03:39Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=2f6f3f4f-82fd-4f26-96ef-89afb6dc811e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.323 187003 DEBUG nova.network.os_vif_util [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "address": "fa:16:3e:e1:03:4b", "network": {"id": "343d4572-e2f0-409b-ab04-cec98c332a12", "bridge": "br-int", "label": "tempest-network-smoke--1432448015", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e27b7-ff", "ovs_interfaceid": "a39e27b7-ff8f-4834-a397-2a7e27da88db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.324 187003 DEBUG nova.network.os_vif_util [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.324 187003 DEBUG os_vif [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.326 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.326 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa39e27b7-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.327 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.328 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.330 187003 INFO os_vif [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:03:4b,bridge_name='br-int',has_traffic_filtering=True,id=a39e27b7-ff8f-4834-a397-2a7e27da88db,network=Network(343d4572-e2f0-409b-ab04-cec98c332a12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa39e27b7-ff')
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.331 187003 INFO nova.virt.libvirt.driver [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Deleting instance files /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e_del
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.332 187003 INFO nova.virt.libvirt.driver [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Deletion of /var/lib/nova/instances/2f6f3f4f-82fd-4f26-96ef-89afb6dc811e_del complete
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.371 187003 INFO nova.compute.manager [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.372 187003 DEBUG oslo.service.loopingcall [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.372 187003 DEBUG nova.compute.manager [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.372 187003 DEBUG nova.network.neutron [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:03:42 compute-0 podman[217480]: 2025-11-24 02:03:42.380962146 +0000 UTC m=+0.045975688 container remove e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.386 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[9b566bac-9a94-4ea3-9db3-f25be0c1934f]: (4, ('Mon Nov 24 02:03:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 (e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4)\ne134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4\nMon Nov 24 02:03:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 (e134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4)\ne134b962f18b9b4418ac0ea08fab62bd8a7516e3fdfedac7fa7702fc6cc342c4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.388 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ea77e2f3-5622-46f1-b094-54b92cdd4f01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.389 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap343d4572-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.390 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 kernel: tap343d4572-e0: left promiscuous mode
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.403 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.407 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbe7f1b-e6d7-44e8-852a-0c344a11eac5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.425 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c664bc81-1850-4657-9282-6454ee6dd0b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.427 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d599bf4b-3a59-42dc-af27-e6cb37b92403]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.445 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc504af-fcf9-4143-b555-362065974d8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328566, 'reachable_time': 37003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217495, 'error': None, 'target': 'ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d343d4572\x2de2f0\x2d409b\x2dab04\x2dcec98c332a12.mount: Deactivated successfully.
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.449 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-343d4572-e2f0-409b-ab04-cec98c332a12 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:03:42 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:42.450 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[db578693-85b9-44c0-b35a-7c441040393e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:03:42 compute-0 nova_compute[186999]: 2025-11-24 02:03:42.547 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.255 187003 DEBUG nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.256 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.256 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.256 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.256 187003 DEBUG nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] No waiting events found dispatching network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.257 187003 DEBUG nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-vif-unplugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.257 187003 DEBUG nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.257 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.257 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.257 187003 DEBUG oslo_concurrency.lockutils [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.258 187003 DEBUG nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] No waiting events found dispatching network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:03:43 compute-0 nova_compute[186999]: 2025-11-24 02:03:43.258 187003 WARNING nova.compute.manager [req-511dcc3d-1682-4da8-b791-e6af44736cd6 req-e66e4472-5099-41f8-a030-7439bab9a2ec 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Received unexpected event network-vif-plugged-a39e27b7-ff8f-4834-a397-2a7e27da88db for instance with vm_state active and task_state deleting.
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.077 187003 DEBUG nova.network.neutron [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.094 187003 INFO nova.compute.manager [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Took 1.72 seconds to deallocate network for instance.
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.136 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.137 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.202 187003 DEBUG nova.compute.provider_tree [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.217 187003 DEBUG nova.scheduler.client.report [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.241 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.266 187003 INFO nova.scheduler.client.report [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e
Nov 24 02:03:44 compute-0 nova_compute[186999]: 2025-11-24 02:03:44.319 187003 DEBUG oslo_concurrency.lockutils [None req-bcae3ad8-203e-4446-876b-a06de12606cd e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "2f6f3f4f-82fd-4f26-96ef-89afb6dc811e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:45 compute-0 podman[217498]: 2025-11-24 02:03:45.821186406 +0000 UTC m=+0.075525295 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 24 02:03:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:46.746 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:03:46 compute-0 nova_compute[186999]: 2025-11-24 02:03:46.747 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:46 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:46.747 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:03:46 compute-0 podman[217518]: 2025-11-24 02:03:46.815993702 +0000 UTC m=+0.061454642 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:03:46 compute-0 podman[217519]: 2025-11-24 02:03:46.861475796 +0000 UTC m=+0.102726908 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 02:03:46 compute-0 sshd-session[217496]: Invalid user sftp from 154.90.59.75 port 40282
Nov 24 02:03:47 compute-0 sshd-session[217496]: Received disconnect from 154.90.59.75 port 40282:11: Bye Bye [preauth]
Nov 24 02:03:47 compute-0 sshd-session[217496]: Disconnected from invalid user sftp 154.90.59.75 port 40282 [preauth]
Nov 24 02:03:47 compute-0 nova_compute[186999]: 2025-11-24 02:03:47.329 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:47 compute-0 nova_compute[186999]: 2025-11-24 02:03:47.550 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:48.425 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:03:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:48.425 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:03:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:48.426 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:03:49 compute-0 nova_compute[186999]: 2025-11-24 02:03:49.974 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:50 compute-0 nova_compute[186999]: 2025-11-24 02:03:50.064 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:52 compute-0 nova_compute[186999]: 2025-11-24 02:03:52.334 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:52 compute-0 nova_compute[186999]: 2025-11-24 02:03:52.552 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:54 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:03:54.750 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:03:57 compute-0 nova_compute[186999]: 2025-11-24 02:03:57.312 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949822.3110123, 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:03:57 compute-0 nova_compute[186999]: 2025-11-24 02:03:57.313 187003 INFO nova.compute.manager [-] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] VM Stopped (Lifecycle Event)
Nov 24 02:03:57 compute-0 nova_compute[186999]: 2025-11-24 02:03:57.328 187003 DEBUG nova.compute.manager [None req-8f77b5f6-9612-4484-8c36-e1c9c73fae95 - - - - - -] [instance: 2f6f3f4f-82fd-4f26-96ef-89afb6dc811e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:03:57 compute-0 nova_compute[186999]: 2025-11-24 02:03:57.338 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:57 compute-0 nova_compute[186999]: 2025-11-24 02:03:57.554 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:03:57 compute-0 podman[217568]: 2025-11-24 02:03:57.832451147 +0000 UTC m=+0.076685319 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 24 02:04:02 compute-0 nova_compute[186999]: 2025-11-24 02:04:02.342 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:02 compute-0 nova_compute[186999]: 2025-11-24 02:04:02.556 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:02 compute-0 podman[217589]: 2025-11-24 02:04:02.868032348 +0000 UTC m=+0.117249014 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Nov 24 02:04:02 compute-0 nova_compute[186999]: 2025-11-24 02:04:02.947 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:02 compute-0 nova_compute[186999]: 2025-11-24 02:04:02.948 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:02 compute-0 nova_compute[186999]: 2025-11-24 02:04:02.967 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.046 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.046 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.054 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.054 187003 INFO nova.compute.claims [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.160 187003 DEBUG nova.compute.provider_tree [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.171 187003 DEBUG nova.scheduler.client.report [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.188 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.189 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.242 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.243 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.257 187003 INFO nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.274 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.348 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.349 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.350 187003 INFO nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Creating image(s)
Nov 24 02:04:03 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.350 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:03 compute-0 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.351 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.352 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.368 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.426 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.428 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.429 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.454 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.510 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.512 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.561 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.562 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.563 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.620 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.622 187003 DEBUG nova.virt.disk.api [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.622 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.680 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.681 187003 DEBUG nova.virt.disk.api [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.681 187003 DEBUG nova.objects.instance [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 93afa27f-f795-4a07-be0e-c1938d1a50b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.691 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.691 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Ensure instance console log exists: /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.692 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.692 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.692 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:03 compute-0 nova_compute[186999]: 2025-11-24 02:04:03.697 187003 DEBUG nova.policy [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:04:04 compute-0 nova_compute[186999]: 2025-11-24 02:04:04.277 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Successfully created port: 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.147 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Successfully updated port: 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.160 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.160 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.160 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.224 187003 DEBUG nova.compute.manager [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.225 187003 DEBUG nova.compute.manager [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing instance network info cache due to event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.225 187003 DEBUG oslo_concurrency.lockutils [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:05 compute-0 nova_compute[186999]: 2025-11-24 02:04:05.286 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.393 187003 DEBUG nova.network.neutron [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.438 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.438 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Instance network_info: |[{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.439 187003 DEBUG oslo_concurrency.lockutils [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.439 187003 DEBUG nova.network.neutron [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.445 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Start _get_guest_xml network_info=[{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.451 187003 WARNING nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.476 187003 DEBUG nova.virt.libvirt.host [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.477 187003 DEBUG nova.virt.libvirt.host [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.484 187003 DEBUG nova.virt.libvirt.host [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.485 187003 DEBUG nova.virt.libvirt.host [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.485 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.486 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.486 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.486 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.486 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.487 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.488 187003 DEBUG nova.virt.hardware [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.492 187003 DEBUG nova.virt.libvirt.vif [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:04:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2080061443',display_name='tempest-TestNetworkBasicOps-server-2080061443',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2080061443',id=10,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJON7LY9f5TScy7Sc7Q52alM91PMnXIwkbTudGk0Ty89/AMy09VCM0oEgiZJrKWLoVd4A3nATuoJ/6iYXbo1oRBtNdz38VsFW+MkAM5ubbuhhaq9G5lCoP54AJmsnChSOQ==',key_name='tempest-TestNetworkBasicOps-380934439',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ld9kw0fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:04:03Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=93afa27f-f795-4a07-be0e-c1938d1a50b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.493 187003 DEBUG nova.network.os_vif_util [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.494 187003 DEBUG nova.network.os_vif_util [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.495 187003 DEBUG nova.objects.instance [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 93afa27f-f795-4a07-be0e-c1938d1a50b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.508 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <uuid>93afa27f-f795-4a07-be0e-c1938d1a50b5</uuid>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <name>instance-0000000a</name>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-2080061443</nova:name>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:04:06</nova:creationTime>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         <nova:port uuid="7ad6f1f2-7185-4de9-8ff9-4a1af5291276">
Nov 24 02:04:06 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <system>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="serial">93afa27f-f795-4a07-be0e-c1938d1a50b5</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="uuid">93afa27f-f795-4a07-be0e-c1938d1a50b5</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </system>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <os>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </os>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <features>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </features>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.config"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:48:e6:1f"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <target dev="tap7ad6f1f2-71"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/console.log" append="off"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <video>
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </video>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:04:06 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:04:06 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:04:06 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:04:06 compute-0 nova_compute[186999]: </domain>
Nov 24 02:04:06 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.509 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Preparing to wait for external event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.510 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.510 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.510 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.511 187003 DEBUG nova.virt.libvirt.vif [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:04:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2080061443',display_name='tempest-TestNetworkBasicOps-server-2080061443',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2080061443',id=10,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJON7LY9f5TScy7Sc7Q52alM91PMnXIwkbTudGk0Ty89/AMy09VCM0oEgiZJrKWLoVd4A3nATuoJ/6iYXbo1oRBtNdz38VsFW+MkAM5ubbuhhaq9G5lCoP54AJmsnChSOQ==',key_name='tempest-TestNetworkBasicOps-380934439',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ld9kw0fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:04:03Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=93afa27f-f795-4a07-be0e-c1938d1a50b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.511 187003 DEBUG nova.network.os_vif_util [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.512 187003 DEBUG nova.network.os_vif_util [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.512 187003 DEBUG os_vif [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.513 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.514 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.514 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.518 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.519 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ad6f1f2-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.520 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ad6f1f2-71, col_values=(('external_ids', {'iface-id': '7ad6f1f2-7185-4de9-8ff9-4a1af5291276', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:e6:1f', 'vm-uuid': '93afa27f-f795-4a07-be0e-c1938d1a50b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.521 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:06 compute-0 NetworkManager[55458]: <info>  [1763949846.5220] manager: (tap7ad6f1f2-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.524 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.530 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.531 187003 INFO os_vif [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71')
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.602 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.603 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.604 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:48:e6:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:04:06 compute-0 nova_compute[186999]: 2025-11-24 02:04:06.604 187003 INFO nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Using config drive
Nov 24 02:04:07 compute-0 nova_compute[186999]: 2025-11-24 02:04:07.558 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.713 187003 INFO nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Creating config drive at /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.config
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.719 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfv70mfhn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.847 187003 DEBUG oslo_concurrency.processutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfv70mfhn" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:08 compute-0 podman[217630]: 2025-11-24 02:04:08.856708428 +0000 UTC m=+0.094172628 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:04:08 compute-0 kernel: tap7ad6f1f2-71: entered promiscuous mode
Nov 24 02:04:08 compute-0 NetworkManager[55458]: <info>  [1763949848.9040] manager: (tap7ad6f1f2-71): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Nov 24 02:04:08 compute-0 ovn_controller[95380]: 2025-11-24T02:04:08Z|00124|binding|INFO|Claiming lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 for this chassis.
Nov 24 02:04:08 compute-0 ovn_controller[95380]: 2025-11-24T02:04:08Z|00125|binding|INFO|7ad6f1f2-7185-4de9-8ff9-4a1af5291276: Claiming fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.904 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.909 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.918 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.924 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:e6:1f 10.100.0.11'], port_security=['fa:16:3e:48:e6:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '93afa27f-f795-4a07-be0e-c1938d1a50b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '097a3b2d-f43f-4a8d-8bcd-d8da40f52a64', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01d64d0-d444-442d-9b39-c73128b8bdf2, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=7ad6f1f2-7185-4de9-8ff9-4a1af5291276) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.925 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 in datapath 03ed51f6-ceda-464c-9e3c-c68e5c559a84 bound to our chassis
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.926 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 03ed51f6-ceda-464c-9e3c-c68e5c559a84
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.938 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d367b14c-8c7e-4d45-a55a-c3638df9a0eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.939 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap03ed51f6-c1 in ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:04:08 compute-0 systemd-udevd[217673]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.941 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap03ed51f6-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.941 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[af45845b-a72b-41b9-afd3-87cbf11381d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.942 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc503bc-ed81-40e2-a380-2a4edfbe8b93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:08 compute-0 systemd-machined[153319]: New machine qemu-10-instance-0000000a.
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.952 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[de6bd9fc-3279-4f1b-b093-8ea120ad755d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:08 compute-0 NetworkManager[55458]: <info>  [1763949848.9568] device (tap7ad6f1f2-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:04:08 compute-0 NetworkManager[55458]: <info>  [1763949848.9577] device (tap7ad6f1f2-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:04:08 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:08.978 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0fd27c-99e4-47d4-bec9-1286eb0f98e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:08 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.988 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:08 compute-0 ovn_controller[95380]: 2025-11-24T02:04:08Z|00126|binding|INFO|Setting lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 ovn-installed in OVS
Nov 24 02:04:08 compute-0 ovn_controller[95380]: 2025-11-24T02:04:08Z|00127|binding|INFO|Setting lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 up in Southbound
Nov 24 02:04:08 compute-0 nova_compute[186999]: 2025-11-24 02:04:08.991 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.008 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e348dbf8-e79c-4946-8274-a699c801dd91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 systemd-udevd[217677]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:04:09 compute-0 NetworkManager[55458]: <info>  [1763949849.0149] manager: (tap03ed51f6-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.014 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe5c9c4-516b-476a-b3bb-0d4a0bdb7d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.051 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9255d8-3655-4422-9fef-7c174c2ec577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.054 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[be5b1207-518f-45ce-b9d2-cdacc7cdcd1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 NetworkManager[55458]: <info>  [1763949849.0785] device (tap03ed51f6-c0): carrier: link connected
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.086 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[0b35de90-9825-474c-8683-bc8e3bd37939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.112 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7cfc0acc-b0fe-4e4f-acc7-cbbee5f478c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap03ed51f6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:cb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331574, 'reachable_time': 41405, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217706, 'error': None, 'target': 'ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.128 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[31c80e4a-bd19-49f6-a481-05b644f155c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe97:cbc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331574, 'tstamp': 331574}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217707, 'error': None, 'target': 'ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.143 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d75a8a3b-d20b-4bd6-885b-f90b663910f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap03ed51f6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:cb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331574, 'reachable_time': 41405, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217708, 'error': None, 'target': 'ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.170 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac3b711-aa70-4cc3-af97-98f6ed359983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.227 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e909f5cf-47f0-4374-bfa0-a21d922370d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.229 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03ed51f6-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.230 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.231 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03ed51f6-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.279 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 kernel: tap03ed51f6-c0: entered promiscuous mode
Nov 24 02:04:09 compute-0 NetworkManager[55458]: <info>  [1763949849.2806] manager: (tap03ed51f6-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.281 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.283 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap03ed51f6-c0, col_values=(('external_ids', {'iface-id': '1e56ac70-cccd-4d19-8899-e1e1de142103'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.285 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 ovn_controller[95380]: 2025-11-24T02:04:09Z|00128|binding|INFO|Releasing lport 1e56ac70-cccd-4d19-8899-e1e1de142103 from this chassis (sb_readonly=0)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.285 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.288 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/03ed51f6-ceda-464c-9e3c-c68e5c559a84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/03ed51f6-ceda-464c-9e3c-c68e5c559a84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.289 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0a90d643-c880-4e58-8c09-da9b06f53f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.290 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-03ed51f6-ceda-464c-9e3c-c68e5c559a84
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/03ed51f6-ceda-464c-9e3c-c68e5c559a84.pid.haproxy
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 03ed51f6-ceda-464c-9e3c-c68e5c559a84
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:04:09 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:09.292 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'env', 'PROCESS_TAG=haproxy-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/03ed51f6-ceda-464c-9e3c-c68e5c559a84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.296 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.365 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949849.3649466, 93afa27f-f795-4a07-be0e-c1938d1a50b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.366 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] VM Started (Lifecycle Event)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.393 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.399 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949849.3655457, 93afa27f-f795-4a07-be0e-c1938d1a50b5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.399 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] VM Paused (Lifecycle Event)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.417 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.422 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.443 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:04:09 compute-0 podman[217747]: 2025-11-24 02:04:09.688419377 +0000 UTC m=+0.062976665 container create b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:04:09 compute-0 systemd[1]: Started libpod-conmon-b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72.scope.
Nov 24 02:04:09 compute-0 podman[217747]: 2025-11-24 02:04:09.659574239 +0000 UTC m=+0.034131547 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:04:09 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:04:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cad08929e4b3e33390717969f5147c529f24b4bb966519bdf6b6f92f47f4dbc4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:09 compute-0 podman[217747]: 2025-11-24 02:04:09.780694641 +0000 UTC m=+0.155252029 container init b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 02:04:09 compute-0 podman[217747]: 2025-11-24 02:04:09.786697039 +0000 UTC m=+0.161254367 container start b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:04:09 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [NOTICE]   (217766) : New worker (217768) forked
Nov 24 02:04:09 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [NOTICE]   (217766) : Loading success.
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.903 187003 DEBUG nova.compute.manager [req-b56f59cf-d73a-479f-a18b-2509166d005e req-e2e0d1c1-6cec-4ca3-9f95-91bad57d0e84 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.904 187003 DEBUG oslo_concurrency.lockutils [req-b56f59cf-d73a-479f-a18b-2509166d005e req-e2e0d1c1-6cec-4ca3-9f95-91bad57d0e84 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.904 187003 DEBUG oslo_concurrency.lockutils [req-b56f59cf-d73a-479f-a18b-2509166d005e req-e2e0d1c1-6cec-4ca3-9f95-91bad57d0e84 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.905 187003 DEBUG oslo_concurrency.lockutils [req-b56f59cf-d73a-479f-a18b-2509166d005e req-e2e0d1c1-6cec-4ca3-9f95-91bad57d0e84 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.905 187003 DEBUG nova.compute.manager [req-b56f59cf-d73a-479f-a18b-2509166d005e req-e2e0d1c1-6cec-4ca3-9f95-91bad57d0e84 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Processing event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.907 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.916 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949849.9131975, 93afa27f-f795-4a07-be0e-c1938d1a50b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.916 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] VM Resumed (Lifecycle Event)
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.920 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.926 187003 INFO nova.virt.libvirt.driver [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Instance spawned successfully.
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.927 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.941 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.951 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.956 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.956 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.957 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.957 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.958 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.959 187003 DEBUG nova.virt.libvirt.driver [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:09 compute-0 nova_compute[186999]: 2025-11-24 02:04:09.988 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:04:10 compute-0 nova_compute[186999]: 2025-11-24 02:04:10.031 187003 INFO nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Took 6.68 seconds to spawn the instance on the hypervisor.
Nov 24 02:04:10 compute-0 nova_compute[186999]: 2025-11-24 02:04:10.032 187003 DEBUG nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:10 compute-0 nova_compute[186999]: 2025-11-24 02:04:10.109 187003 INFO nova.compute.manager [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Took 7.09 seconds to build instance.
Nov 24 02:04:10 compute-0 nova_compute[186999]: 2025-11-24 02:04:10.123 187003 DEBUG oslo_concurrency.lockutils [None req-3e73abf3-45a1-4cb3-9229-f340a89b154d e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.329 187003 DEBUG nova.network.neutron [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updated VIF entry in instance network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.329 187003 DEBUG nova.network.neutron [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.349 187003 DEBUG oslo_concurrency.lockutils [req-f2d19c54-f555-4aa9-a4fb-9b29c4e726b6 req-e2cbfe18-5832-40d8-bc85-fc3bbb3c2ba0 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.549 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.924 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.924 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.925 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.925 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 93afa27f-f795-4a07-be0e-c1938d1a50b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.989 187003 DEBUG nova.compute.manager [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.990 187003 DEBUG oslo_concurrency.lockutils [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.990 187003 DEBUG oslo_concurrency.lockutils [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.991 187003 DEBUG oslo_concurrency.lockutils [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.991 187003 DEBUG nova.compute.manager [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] No waiting events found dispatching network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:04:11 compute-0 nova_compute[186999]: 2025-11-24 02:04:11.992 187003 WARNING nova.compute.manager [req-9555c805-f39d-4a5a-8d01-9a9ec6aa8614 req-c4bfb9f9-8b85-433c-97b3-f95d054ef45f 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received unexpected event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 for instance with vm_state active and task_state None.
Nov 24 02:04:12 compute-0 nova_compute[186999]: 2025-11-24 02:04:12.560 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:12 compute-0 podman[217777]: 2025-11-24 02:04:12.809599313 +0000 UTC m=+0.058741496 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:04:14 compute-0 nova_compute[186999]: 2025-11-24 02:04:14.767 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.182 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.183 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.183 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:15 compute-0 nova_compute[186999]: 2025-11-24 02:04:15.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:16 compute-0 NetworkManager[55458]: <info>  [1763949856.1182] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 24 02:04:16 compute-0 NetworkManager[55458]: <info>  [1763949856.1194] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 24 02:04:16 compute-0 ovn_controller[95380]: 2025-11-24T02:04:16Z|00129|binding|INFO|Releasing lport 1e56ac70-cccd-4d19-8899-e1e1de142103 from this chassis (sb_readonly=0)
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.118 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.157 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:16 compute-0 ovn_controller[95380]: 2025-11-24T02:04:16Z|00130|binding|INFO|Releasing lport 1e56ac70-cccd-4d19-8899-e1e1de142103 from this chassis (sb_readonly=0)
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.163 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.325 187003 DEBUG nova.compute.manager [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.325 187003 DEBUG nova.compute.manager [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing instance network info cache due to event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.325 187003 DEBUG oslo_concurrency.lockutils [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.326 187003 DEBUG oslo_concurrency.lockutils [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.326 187003 DEBUG nova.network.neutron [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.555 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.792 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.792 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.792 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.793 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:04:16 compute-0 podman[217796]: 2025-11-24 02:04:16.831205283 +0000 UTC m=+0.079651141 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.856 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:16 compute-0 podman[217817]: 2025-11-24 02:04:16.913989541 +0000 UTC m=+0.057222533 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.928 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.929 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:16 compute-0 nova_compute[186999]: 2025-11-24 02:04:16.987 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:17 compute-0 podman[217842]: 2025-11-24 02:04:17.069097874 +0000 UTC m=+0.125197146 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.152 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.153 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=73.45925903320312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.153 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.154 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.235 187003 DEBUG nova.network.neutron [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updated VIF entry in instance network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.237 187003 DEBUG nova.network.neutron [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.247 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 93afa27f-f795-4a07-be0e-c1938d1a50b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.247 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.248 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.260 187003 DEBUG oslo_concurrency.lockutils [req-71f86e04-85b7-4cc4-8f51-d187faa0c19e req-bb694aa8-cb5f-4bb1-9b3a-bd27724c38fa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.294 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.307 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.326 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.327 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:17 compute-0 nova_compute[186999]: 2025-11-24 02:04:17.569 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:17 compute-0 sshd-session[217873]: Invalid user exx from 46.188.119.26 port 37794
Nov 24 02:04:17 compute-0 sshd-session[217873]: Received disconnect from 46.188.119.26 port 37794:11: Bye Bye [preauth]
Nov 24 02:04:17 compute-0 sshd-session[217873]: Disconnected from invalid user exx 46.188.119.26 port 37794 [preauth]
Nov 24 02:04:20 compute-0 nova_compute[186999]: 2025-11-24 02:04:20.328 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:04:20 compute-0 nova_compute[186999]: 2025-11-24 02:04:20.329 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:04:21 compute-0 nova_compute[186999]: 2025-11-24 02:04:21.563 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:21 compute-0 ovn_controller[95380]: 2025-11-24T02:04:21Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:21 compute-0 ovn_controller[95380]: 2025-11-24T02:04:21Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:22 compute-0 nova_compute[186999]: 2025-11-24 02:04:22.569 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:26 compute-0 nova_compute[186999]: 2025-11-24 02:04:26.568 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:27 compute-0 nova_compute[186999]: 2025-11-24 02:04:27.570 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:28 compute-0 nova_compute[186999]: 2025-11-24 02:04:28.671 187003 INFO nova.compute.manager [None req-de30a5ba-d24d-4b35-bea0-84beabb08251 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Get console output
Nov 24 02:04:28 compute-0 nova_compute[186999]: 2025-11-24 02:04:28.681 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:04:28 compute-0 podman[217892]: 2025-11-24 02:04:28.817655307 +0000 UTC m=+0.069976000 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 02:04:31 compute-0 ovn_controller[95380]: 2025-11-24T02:04:31Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:31 compute-0 nova_compute[186999]: 2025-11-24 02:04:31.572 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:32 compute-0 nova_compute[186999]: 2025-11-24 02:04:32.574 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:33 compute-0 podman[217912]: 2025-11-24 02:04:33.869036571 +0000 UTC m=+0.112304526 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 24 02:04:33 compute-0 ovn_controller[95380]: 2025-11-24T02:04:33Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.136 187003 DEBUG nova.compute.manager [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.137 187003 DEBUG nova.compute.manager [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing instance network info cache due to event network-changed-7ad6f1f2-7185-4de9-8ff9-4a1af5291276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.138 187003 DEBUG oslo_concurrency.lockutils [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.138 187003 DEBUG oslo_concurrency.lockutils [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.138 187003 DEBUG nova.network.neutron [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Refreshing network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.216 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.216 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.217 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.217 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.217 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.220 187003 INFO nova.compute.manager [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Terminating instance
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.221 187003 DEBUG nova.compute.manager [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:04:34 compute-0 kernel: tap7ad6f1f2-71 (unregistering): left promiscuous mode
Nov 24 02:04:34 compute-0 NetworkManager[55458]: <info>  [1763949874.2516] device (tap7ad6f1f2-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00131|binding|INFO|Releasing lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 from this chassis (sb_readonly=0)
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00132|binding|INFO|Setting lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 down in Southbound
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.266 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00133|binding|INFO|Removing iface tap7ad6f1f2-71 ovn-installed in OVS
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.268 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.276 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:e6:1f 10.100.0.11'], port_security=['fa:16:3e:48:e6:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '93afa27f-f795-4a07-be0e-c1938d1a50b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '097a3b2d-f43f-4a8d-8bcd-d8da40f52a64', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01d64d0-d444-442d-9b39-c73128b8bdf2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=7ad6f1f2-7185-4de9-8ff9-4a1af5291276) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.279 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 in datapath 03ed51f6-ceda-464c-9e3c-c68e5c559a84 unbound from our chassis
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.280 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 03ed51f6-ceda-464c-9e3c-c68e5c559a84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.286 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e70d0216-7841-46df-ac34-56791a5fd399]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.289 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84 namespace which is not needed anymore
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.292 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 24 02:04:34 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 13.079s CPU time.
Nov 24 02:04:34 compute-0 systemd-machined[153319]: Machine qemu-10-instance-0000000a terminated.
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [NOTICE]   (217766) : haproxy version is 2.8.14-c23fe91
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [NOTICE]   (217766) : path to executable is /usr/sbin/haproxy
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [WARNING]  (217766) : Exiting Master process...
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [WARNING]  (217766) : Exiting Master process...
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [ALERT]    (217766) : Current worker (217768) exited with code 143 (Terminated)
Nov 24 02:04:34 compute-0 neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84[217762]: [WARNING]  (217766) : All workers exited. Exiting... (0)
Nov 24 02:04:34 compute-0 kernel: tap7ad6f1f2-71: entered promiscuous mode
Nov 24 02:04:34 compute-0 kernel: tap7ad6f1f2-71 (unregistering): left promiscuous mode
Nov 24 02:04:34 compute-0 systemd[1]: libpod-b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72.scope: Deactivated successfully.
Nov 24 02:04:34 compute-0 podman[217962]: 2025-11-24 02:04:34.458368523 +0000 UTC m=+0.057838951 container died b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.465 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00134|binding|INFO|Claiming lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 for this chassis.
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00135|binding|INFO|7ad6f1f2-7185-4de9-8ff9-4a1af5291276: Claiming fa:16:3e:48:e6:1f 10.100.0.11
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.481 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:e6:1f 10.100.0.11'], port_security=['fa:16:3e:48:e6:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '93afa27f-f795-4a07-be0e-c1938d1a50b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '097a3b2d-f43f-4a8d-8bcd-d8da40f52a64', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01d64d0-d444-442d-9b39-c73128b8bdf2, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=7ad6f1f2-7185-4de9-8ff9-4a1af5291276) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:34 compute-0 ovn_controller[95380]: 2025-11-24T02:04:34Z|00136|binding|INFO|Releasing lport 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 from this chassis (sb_readonly=0)
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.495 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.504 187003 INFO nova.virt.libvirt.driver [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Instance destroyed successfully.
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.504 187003 DEBUG nova.objects.instance [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 93afa27f-f795-4a07-be0e-c1938d1a50b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.504 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:e6:1f 10.100.0.11'], port_security=['fa:16:3e:48:e6:1f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '93afa27f-f795-4a07-be0e-c1938d1a50b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '097a3b2d-f43f-4a8d-8bcd-d8da40f52a64', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01d64d0-d444-442d-9b39-c73128b8bdf2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=7ad6f1f2-7185-4de9-8ff9-4a1af5291276) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72-userdata-shm.mount: Deactivated successfully.
Nov 24 02:04:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-cad08929e4b3e33390717969f5147c529f24b4bb966519bdf6b6f92f47f4dbc4-merged.mount: Deactivated successfully.
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.522 187003 DEBUG nova.virt.libvirt.vif [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:04:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2080061443',display_name='tempest-TestNetworkBasicOps-server-2080061443',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2080061443',id=10,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJON7LY9f5TScy7Sc7Q52alM91PMnXIwkbTudGk0Ty89/AMy09VCM0oEgiZJrKWLoVd4A3nATuoJ/6iYXbo1oRBtNdz38VsFW+MkAM5ubbuhhaq9G5lCoP54AJmsnChSOQ==',key_name='tempest-TestNetworkBasicOps-380934439',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:04:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-ld9kw0fx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:04:10Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=93afa27f-f795-4a07-be0e-c1938d1a50b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.523 187003 DEBUG nova.network.os_vif_util [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.524 187003 DEBUG nova.network.os_vif_util [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.524 187003 DEBUG os_vif [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.526 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.526 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ad6f1f2-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.528 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 podman[217962]: 2025-11-24 02:04:34.530223225 +0000 UTC m=+0.129693633 container cleanup b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.531 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.536 187003 INFO os_vif [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:e6:1f,bridge_name='br-int',has_traffic_filtering=True,id=7ad6f1f2-7185-4de9-8ff9-4a1af5291276,network=Network(03ed51f6-ceda-464c-9e3c-c68e5c559a84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad6f1f2-71')
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.537 187003 INFO nova.virt.libvirt.driver [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Deleting instance files /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5_del
Nov 24 02:04:34 compute-0 systemd[1]: libpod-conmon-b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72.scope: Deactivated successfully.
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.538 187003 INFO nova.virt.libvirt.driver [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Deletion of /var/lib/nova/instances/93afa27f-f795-4a07-be0e-c1938d1a50b5_del complete
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.580 187003 INFO nova.compute.manager [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.581 187003 DEBUG oslo.service.loopingcall [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.582 187003 DEBUG nova.compute.manager [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.582 187003 DEBUG nova.network.neutron [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:04:34 compute-0 podman[218009]: 2025-11-24 02:04:34.602670904 +0000 UTC m=+0.044624931 container remove b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.611 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[cf644e4f-e0bd-4688-a266-647b552cd9a1]: (4, ('Mon Nov 24 02:04:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84 (b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72)\nb9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72\nMon Nov 24 02:04:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84 (b9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72)\nb9ec69cbb05f1db593e0e47b125ff573cc867d6c9726ed54f0a3d4f3733d8f72\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.613 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[007278c0-cfee-4c2b-b64f-8fb024eecd96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.615 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03ed51f6-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.618 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 kernel: tap03ed51f6-c0: left promiscuous mode
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.620 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.623 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[59b6bc4a-326b-41cc-9978-3c90689e2210]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 nova_compute[186999]: 2025-11-24 02:04:34.633 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.648 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f591e45a-6f2d-4b77-96bb-d13481cc003b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.650 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[8997ed1a-0673-42ab-9065-d50900097851]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.666 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4989e2-72b1-4e7b-85fe-7739d3b500f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331566, 'reachable_time': 29993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218022, 'error': None, 'target': 'ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.669 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-03ed51f6-ceda-464c-9e3c-c68e5c559a84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.669 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce5fe0c-bd75-466d-a087-efc521676e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d03ed51f6\x2dceda\x2d464c\x2d9e3c\x2dc68e5c559a84.mount: Deactivated successfully.
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.670 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 in datapath 03ed51f6-ceda-464c-9e3c-c68e5c559a84 unbound from our chassis
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.671 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 03ed51f6-ceda-464c-9e3c-c68e5c559a84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.672 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a55b52-402d-463c-b91e-b0b5efdd9e90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.672 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276 in datapath 03ed51f6-ceda-464c-9e3c-c68e5c559a84 unbound from our chassis
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.673 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 03ed51f6-ceda-464c-9e3c-c68e5c559a84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:04:34 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:34.674 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[5e03b5db-027f-425b-a3f7-3be9526211a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.126 187003 DEBUG nova.network.neutron [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.144 187003 INFO nova.compute.manager [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Took 0.56 seconds to deallocate network for instance.
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.187 187003 DEBUG nova.network.neutron [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updated VIF entry in instance network info cache for port 7ad6f1f2-7185-4de9-8ff9-4a1af5291276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.188 187003 DEBUG nova.network.neutron [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Updating instance_info_cache with network_info: [{"id": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "address": "fa:16:3e:48:e6:1f", "network": {"id": "03ed51f6-ceda-464c-9e3c-c68e5c559a84", "bridge": "br-int", "label": "tempest-network-smoke--765527605", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad6f1f2-71", "ovs_interfaceid": "7ad6f1f2-7185-4de9-8ff9-4a1af5291276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.202 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.202 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.211 187003 DEBUG nova.compute.manager [req-db90dd6b-c4ab-48e1-8b56-68ef9060faf0 req-bdef75e7-9537-4335-ae6a-77718de829da 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-vif-deleted-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.213 187003 DEBUG oslo_concurrency.lockutils [req-7c7b10f6-ee81-41d5-a761-d4bc3db24c60 req-56757768-1f73-486a-9c8c-39b10c858d03 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-93afa27f-f795-4a07-be0e-c1938d1a50b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.276 187003 DEBUG nova.compute.provider_tree [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.288 187003 DEBUG nova.scheduler.client.report [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.310 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.335 187003 INFO nova.scheduler.client.report [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 93afa27f-f795-4a07-be0e-c1938d1a50b5
Nov 24 02:04:35 compute-0 nova_compute[186999]: 2025-11-24 02:04:35.384 187003 DEBUG oslo_concurrency.lockutils [None req-c5c0a9b3-92e2-4a6c-b329-41e06184fe27 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.218 187003 DEBUG nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-vif-unplugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.218 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.219 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.219 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.219 187003 DEBUG nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] No waiting events found dispatching network-vif-unplugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.219 187003 WARNING nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received unexpected event network-vif-unplugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 for instance with vm_state deleted and task_state None.
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 DEBUG nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 DEBUG oslo_concurrency.lockutils [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "93afa27f-f795-4a07-be0e-c1938d1a50b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 DEBUG nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] No waiting events found dispatching network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:04:36 compute-0 nova_compute[186999]: 2025-11-24 02:04:36.220 187003 WARNING nova.compute.manager [req-c1caa5a0-8db8-4e5d-a95a-fca5b9c7d235 req-5e65c24b-f168-4a5b-9d66-f31c90263a73 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Received unexpected event network-vif-plugged-7ad6f1f2-7185-4de9-8ff9-4a1af5291276 for instance with vm_state deleted and task_state None.
Nov 24 02:04:37 compute-0 nova_compute[186999]: 2025-11-24 02:04:37.576 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:39 compute-0 nova_compute[186999]: 2025-11-24 02:04:39.324 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:39 compute-0 nova_compute[186999]: 2025-11-24 02:04:39.394 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:39 compute-0 nova_compute[186999]: 2025-11-24 02:04:39.529 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:39 compute-0 podman[218024]: 2025-11-24 02:04:39.817809473 +0000 UTC m=+0.063110139 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:04:42 compute-0 nova_compute[186999]: 2025-11-24 02:04:42.577 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:43 compute-0 podman[218048]: 2025-11-24 02:04:43.796049178 +0000 UTC m=+0.050050733 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 02:04:44 compute-0 nova_compute[186999]: 2025-11-24 02:04:44.533 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:47 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:47.525 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:47 compute-0 nova_compute[186999]: 2025-11-24 02:04:47.526 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:47 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:47.527 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:04:47 compute-0 nova_compute[186999]: 2025-11-24 02:04:47.625 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:47 compute-0 podman[218068]: 2025-11-24 02:04:47.810177878 +0000 UTC m=+0.058306354 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:04:47 compute-0 podman[218067]: 2025-11-24 02:04:47.811128115 +0000 UTC m=+0.066833153 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:04:47 compute-0 podman[218069]: 2025-11-24 02:04:47.873870732 +0000 UTC m=+0.119182379 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 02:04:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:48.426 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:48.427 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:49 compute-0 nova_compute[186999]: 2025-11-24 02:04:49.502 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949874.501055, 93afa27f-f795-4a07-be0e-c1938d1a50b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:49 compute-0 nova_compute[186999]: 2025-11-24 02:04:49.503 187003 INFO nova.compute.manager [-] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] VM Stopped (Lifecycle Event)
Nov 24 02:04:49 compute-0 nova_compute[186999]: 2025-11-24 02:04:49.518 187003 DEBUG nova.compute.manager [None req-74603657-3877-43c9-b456-1a95f69dc5cf - - - - - -] [instance: 93afa27f-f795-4a07-be0e-c1938d1a50b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:49 compute-0 nova_compute[186999]: 2025-11-24 02:04:49.536 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:52 compute-0 nova_compute[186999]: 2025-11-24 02:04:52.627 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.019 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.020 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.038 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.098 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.099 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.108 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.109 187003 INFO nova.compute.claims [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.215 187003 DEBUG nova.compute.provider_tree [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.226 187003 DEBUG nova.scheduler.client.report [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.245 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.246 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.284 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.285 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.299 187003 INFO nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.310 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.408 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.409 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.409 187003 INFO nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Creating image(s)
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.410 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.410 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.411 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.422 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.493 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.495 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.495 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.510 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.569 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.571 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.625 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.626 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.627 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.682 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.684 187003 DEBUG nova.virt.disk.api [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.685 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.708 187003 DEBUG nova.policy [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.749 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.750 187003 DEBUG nova.virt.disk.api [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.750 187003 DEBUG nova.objects.instance [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid b870f828-e429-4acb-8457-dd2521c13114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.762 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.762 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Ensure instance console log exists: /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.763 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.763 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:53 compute-0 nova_compute[186999]: 2025-11-24 02:04:53.763 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:54 compute-0 nova_compute[186999]: 2025-11-24 02:04:54.406 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Successfully created port: 4fbe252d-e231-4421-9a71-f8470765731a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:04:54 compute-0 nova_compute[186999]: 2025-11-24 02:04:54.540 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.217 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Successfully updated port: 4fbe252d-e231-4421-9a71-f8470765731a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.229 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.230 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.230 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.292 187003 DEBUG nova.compute.manager [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-changed-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.293 187003 DEBUG nova.compute.manager [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing instance network info cache due to event network-changed-4fbe252d-e231-4421-9a71-f8470765731a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.293 187003 DEBUG oslo_concurrency.lockutils [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:04:55 compute-0 nova_compute[186999]: 2025-11-24 02:04:55.355 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.076 187003 DEBUG nova.network.neutron [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.094 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.094 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance network_info: |[{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.095 187003 DEBUG oslo_concurrency.lockutils [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.096 187003 DEBUG nova.network.neutron [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.101 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Start _get_guest_xml network_info=[{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.108 187003 WARNING nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.117 187003 DEBUG nova.virt.libvirt.host [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.118 187003 DEBUG nova.virt.libvirt.host [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.123 187003 DEBUG nova.virt.libvirt.host [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.124 187003 DEBUG nova.virt.libvirt.host [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.124 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.125 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.126 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.126 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.127 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.127 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.128 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.128 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.129 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.129 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.129 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.130 187003 DEBUG nova.virt.hardware [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.137 187003 DEBUG nova.virt.libvirt.vif [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:04:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1543136382',display_name='tempest-TestNetworkBasicOps-server-1543136382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1543136382',id=11,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSP5FYoevblVVxr+BlHTysiWoOrbVN2UVjIu6ow/i8LBB5RNm/LYmCpco9bNSaiFRAxNFEdqZvYlD2+9SJuOtadsfugvNA6DYV5TI4dIdeRKmrGhNemySVx7Nw/dvl0WA==',key_name='tempest-TestNetworkBasicOps-14717754',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-foj6r87p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:04:53Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=b870f828-e429-4acb-8457-dd2521c13114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.138 187003 DEBUG nova.network.os_vif_util [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.139 187003 DEBUG nova.network.os_vif_util [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.140 187003 DEBUG nova.objects.instance [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid b870f828-e429-4acb-8457-dd2521c13114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.152 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <uuid>b870f828-e429-4acb-8457-dd2521c13114</uuid>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <name>instance-0000000b</name>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-1543136382</nova:name>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:04:56</nova:creationTime>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         <nova:port uuid="4fbe252d-e231-4421-9a71-f8470765731a">
Nov 24 02:04:56 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <system>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="serial">b870f828-e429-4acb-8457-dd2521c13114</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="uuid">b870f828-e429-4acb-8457-dd2521c13114</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </system>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <os>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </os>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <features>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </features>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.config"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:30:a0:3b"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <target dev="tap4fbe252d-e2"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/console.log" append="off"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <video>
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </video>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:04:56 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:04:56 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:04:56 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:04:56 compute-0 nova_compute[186999]: </domain>
Nov 24 02:04:56 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.153 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Preparing to wait for external event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.154 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.155 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.155 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.156 187003 DEBUG nova.virt.libvirt.vif [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:04:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1543136382',display_name='tempest-TestNetworkBasicOps-server-1543136382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1543136382',id=11,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSP5FYoevblVVxr+BlHTysiWoOrbVN2UVjIu6ow/i8LBB5RNm/LYmCpco9bNSaiFRAxNFEdqZvYlD2+9SJuOtadsfugvNA6DYV5TI4dIdeRKmrGhNemySVx7Nw/dvl0WA==',key_name='tempest-TestNetworkBasicOps-14717754',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-foj6r87p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:04:53Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=b870f828-e429-4acb-8457-dd2521c13114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.157 187003 DEBUG nova.network.os_vif_util [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.158 187003 DEBUG nova.network.os_vif_util [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.159 187003 DEBUG os_vif [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.159 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.160 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.161 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.165 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.166 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4fbe252d-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.166 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4fbe252d-e2, col_values=(('external_ids', {'iface-id': '4fbe252d-e231-4421-9a71-f8470765731a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:a0:3b', 'vm-uuid': 'b870f828-e429-4acb-8457-dd2521c13114'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.168 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.1706] manager: (tap4fbe252d-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.172 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.175 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.176 187003 INFO os_vif [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2')
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.230 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.231 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.231 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:30:a0:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.232 187003 INFO nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Using config drive
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.528 187003 INFO nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Creating config drive at /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.config
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.528 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.534 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7lyklob execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.674 187003 DEBUG oslo_concurrency.processutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7lyklob" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:04:56 compute-0 kernel: tap4fbe252d-e2: entered promiscuous mode
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.756 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.7576] manager: (tap4fbe252d-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 24 02:04:56 compute-0 ovn_controller[95380]: 2025-11-24T02:04:56Z|00137|binding|INFO|Claiming lport 4fbe252d-e231-4421-9a71-f8470765731a for this chassis.
Nov 24 02:04:56 compute-0 ovn_controller[95380]: 2025-11-24T02:04:56Z|00138|binding|INFO|4fbe252d-e231-4421-9a71-f8470765731a: Claiming fa:16:3e:30:a0:3b 10.100.0.3
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.770 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.795 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:a0:3b 10.100.0.3'], port_security=['fa:16:3e:30:a0:3b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '882b8133-50bb-4df0-b17c-f8a6f2c6d8a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb1fb74-fc05-473e-b6a7-f2e41e415ed2, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=4fbe252d-e231-4421-9a71-f8470765731a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.796 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 4fbe252d-e231-4421-9a71-f8470765731a in datapath 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 bound to our chassis
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.797 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737
Nov 24 02:04:56 compute-0 systemd-udevd[218169]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.812 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c682675b-0dc3-4173-99ec-4065f5b94275]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.815 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap67be9a0e-01 in ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.8191] device (tap4fbe252d-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:04:56 compute-0 systemd-machined[153319]: New machine qemu-11-instance-0000000b.
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.819 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap67be9a0e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.8212] device (tap4fbe252d-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.819 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1d5c31-90bd-4a98-8fdb-f9d088537ff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.821 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2a954d-cbb4-42c5-b44e-17ea30e953b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.837 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[75ef4363-0290-4ddd-aefb-f3110188010a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_controller[95380]: 2025-11-24T02:04:56Z|00139|binding|INFO|Setting lport 4fbe252d-e231-4421-9a71-f8470765731a ovn-installed in OVS
Nov 24 02:04:56 compute-0 ovn_controller[95380]: 2025-11-24T02:04:56Z|00140|binding|INFO|Setting lport 4fbe252d-e231-4421-9a71-f8470765731a up in Southbound
Nov 24 02:04:56 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Nov 24 02:04:56 compute-0 nova_compute[186999]: 2025-11-24 02:04:56.858 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.870 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8248fc-8c16-425c-9990-072fc15ab5f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.903 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0c114e-deb8-48d9-aeee-75d07747f86e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.908 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[059275c1-de0b-4c24-9096-9a133e1f43b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.9097] manager: (tap67be9a0e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.952 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[a84677ed-386a-48cc-aff6-b6e9a52e2db0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.956 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1aebaf-c898-4b68-a682-2082ddbaa38a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:56 compute-0 NetworkManager[55458]: <info>  [1763949896.9868] device (tap67be9a0e-00): carrier: link connected
Nov 24 02:04:56 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:56.994 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a8f64c-7192-49be-b0a0-01d62e55b797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.016 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb8e607-8d7f-4996-90a3-c02ecf735d20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67be9a0e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:c9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336364, 'reachable_time': 39192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218203, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.043 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1926f60b-9239-4f1d-a83b-5b8583c549c3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe42:c946'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336364, 'tstamp': 336364}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218204, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.066 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbb408c-e2fa-49f1-b3ed-790eed242823]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67be9a0e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:c9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336364, 'reachable_time': 39192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218205, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.114 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c0eb1337-a10e-48fe-894b-3c5f1b54e21a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.208 187003 DEBUG nova.network.neutron [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updated VIF entry in instance network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.208 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[820d0e50-9cd8-4d3f-adfd-ba4c2474d614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.209 187003 DEBUG nova.network.neutron [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.209 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67be9a0e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.210 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.210 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67be9a0e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:57 compute-0 kernel: tap67be9a0e-00: entered promiscuous mode
Nov 24 02:04:57 compute-0 NetworkManager[55458]: <info>  [1763949897.2148] manager: (tap67be9a0e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.212 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.217 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap67be9a0e-00, col_values=(('external_ids', {'iface-id': 'f321bed4-e0fc-4886-ba7a-71eb9cae7cc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:04:57 compute-0 ovn_controller[95380]: 2025-11-24T02:04:57Z|00141|binding|INFO|Releasing lport f321bed4-e0fc-4886-ba7a-71eb9cae7cc4 from this chassis (sb_readonly=0)
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.222 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/67be9a0e-0da1-48ec-8b2b-8b93cf4e1737.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/67be9a0e-0da1-48ec-8b2b-8b93cf4e1737.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.223 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[38e3a59a-9e52-42b8-aa40-4b2e7c1cbbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.224 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/67be9a0e-0da1-48ec-8b2b-8b93cf4e1737.pid.haproxy
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.224 187003 DEBUG oslo_concurrency.lockutils [req-7d9e257c-af43-4bd9-a3bf-b0fc55ac0c70 req-807a5494-8f33-4b7f-bf92-7eb94096f9b3 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:04:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:04:57.225 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'env', 'PROCESS_TAG=haproxy-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/67be9a0e-0da1-48ec-8b2b-8b93cf4e1737.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.230 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.303 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949897.30272, b870f828-e429-4acb-8457-dd2521c13114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.303 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] VM Started (Lifecycle Event)
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.320 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.330 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949897.3042452, b870f828-e429-4acb-8457-dd2521c13114 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.331 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] VM Paused (Lifecycle Event)
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.358 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.361 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.373 187003 DEBUG nova.compute.manager [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.374 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.374 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.374 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.375 187003 DEBUG nova.compute.manager [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Processing event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.375 187003 DEBUG nova.compute.manager [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.375 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.375 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.376 187003 DEBUG oslo_concurrency.lockutils [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.376 187003 DEBUG nova.compute.manager [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.376 187003 WARNING nova.compute.manager [req-5ee3461f-e063-482c-a45a-687f0086611b req-3decf35d-9055-44c8-a484-1961438a61e5 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state building and task_state spawning.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.377 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.385 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.386 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949897.3819697, b870f828-e429-4acb-8457-dd2521c13114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.386 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] VM Resumed (Lifecycle Event)
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.408 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.411 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.413 187003 INFO nova.virt.libvirt.driver [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance spawned successfully.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.414 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.416 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.446 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.454 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.454 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.455 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.455 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.455 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.456 187003 DEBUG nova.virt.libvirt.driver [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.515 187003 INFO nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Took 4.11 seconds to spawn the instance on the hypervisor.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.516 187003 DEBUG nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.583 187003 INFO nova.compute.manager [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Took 4.51 seconds to build instance.
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.601 187003 DEBUG oslo_concurrency.lockutils [None req-9256ff43-b86f-4a3a-a2a2-5c0f6ad086da e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:04:57 compute-0 nova_compute[186999]: 2025-11-24 02:04:57.629 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:04:57 compute-0 podman[218245]: 2025-11-24 02:04:57.635635163 +0000 UTC m=+0.054858267 container create aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:04:57 compute-0 systemd[1]: Started libpod-conmon-aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8.scope.
Nov 24 02:04:57 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d253a647779153898389aa5cb499205007e06db27cbe41cb63df881fccfed4c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:04:57 compute-0 podman[218245]: 2025-11-24 02:04:57.608241666 +0000 UTC m=+0.027464790 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:04:57 compute-0 podman[218245]: 2025-11-24 02:04:57.71374892 +0000 UTC m=+0.132972024 container init aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:04:57 compute-0 podman[218245]: 2025-11-24 02:04:57.719660946 +0000 UTC m=+0.138884050 container start aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 02:04:57 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [NOTICE]   (218264) : New worker (218266) forked
Nov 24 02:04:57 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [NOTICE]   (218264) : Loading success.
Nov 24 02:04:59 compute-0 podman[218275]: 2025-11-24 02:04:59.826391167 +0000 UTC m=+0.068654093 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 24 02:05:00 compute-0 ovn_controller[95380]: 2025-11-24T02:05:00Z|00142|binding|INFO|Releasing lport f321bed4-e0fc-4886-ba7a-71eb9cae7cc4 from this chassis (sb_readonly=0)
Nov 24 02:05:00 compute-0 NetworkManager[55458]: <info>  [1763949900.9732] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 24 02:05:00 compute-0 NetworkManager[55458]: <info>  [1763949900.9741] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 24 02:05:00 compute-0 nova_compute[186999]: 2025-11-24 02:05:00.980 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:01 compute-0 ovn_controller[95380]: 2025-11-24T02:05:01Z|00143|binding|INFO|Releasing lport f321bed4-e0fc-4886-ba7a-71eb9cae7cc4 from this chassis (sb_readonly=0)
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.008 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.013 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.168 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.864 187003 DEBUG nova.compute.manager [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-changed-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.864 187003 DEBUG nova.compute.manager [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing instance network info cache due to event network-changed-4fbe252d-e231-4421-9a71-f8470765731a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.865 187003 DEBUG oslo_concurrency.lockutils [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.865 187003 DEBUG oslo_concurrency.lockutils [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:01 compute-0 nova_compute[186999]: 2025-11-24 02:05:01.865 187003 DEBUG nova.network.neutron [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:02 compute-0 nova_compute[186999]: 2025-11-24 02:05:02.631 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:03 compute-0 nova_compute[186999]: 2025-11-24 02:05:03.102 187003 DEBUG nova.network.neutron [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updated VIF entry in instance network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:05:03 compute-0 nova_compute[186999]: 2025-11-24 02:05:03.103 187003 DEBUG nova.network.neutron [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:03 compute-0 nova_compute[186999]: 2025-11-24 02:05:03.119 187003 DEBUG oslo_concurrency.lockutils [req-256efcd8-91c9-45fc-9b32-bf535d3951a0 req-cfef5a25-e42b-4602-9059-89d2b5afc432 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:04 compute-0 podman[218298]: 2025-11-24 02:05:04.851565307 +0000 UTC m=+0.094914819 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.177 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.605 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.606 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.621 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.727 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.728 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.739 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.739 187003 INFO nova.compute.claims [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.832 187003 DEBUG nova.compute.provider_tree [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.844 187003 DEBUG nova.scheduler.client.report [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.873 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.874 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.940 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.941 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.958 187003 INFO nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:05:06 compute-0 nova_compute[186999]: 2025-11-24 02:05:06.971 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.049 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.050 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.051 187003 INFO nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Creating image(s)
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.051 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.052 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.053 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.067 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.123 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.124 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.125 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.138 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.194 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.195 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.235 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.237 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.237 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.297 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.299 187003 DEBUG nova.virt.disk.api [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.300 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.364 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.366 187003 DEBUG nova.virt.disk.api [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.367 187003 DEBUG nova.objects.instance [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.380 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.381 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Ensure instance console log exists: /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.382 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.383 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.383 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.633 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:07 compute-0 nova_compute[186999]: 2025-11-24 02:05:07.700 187003 DEBUG nova.policy [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:05:08 compute-0 nova_compute[186999]: 2025-11-24 02:05:08.803 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Successfully created port: 69e1c86d-39c3-43f3-9c75-c7ad5c634510 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:05:09 compute-0 ovn_controller[95380]: 2025-11-24T02:05:09Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:a0:3b 10.100.0.3
Nov 24 02:05:09 compute-0 ovn_controller[95380]: 2025-11-24T02:05:09Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:a0:3b 10.100.0.3
Nov 24 02:05:09 compute-0 nova_compute[186999]: 2025-11-24 02:05:09.988 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Successfully updated port: 69e1c86d-39c3-43f3-9c75-c7ad5c634510 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.006 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.007 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.007 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.080 187003 DEBUG nova.compute.manager [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.080 187003 DEBUG nova.compute.manager [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing instance network info cache due to event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.081 187003 DEBUG oslo_concurrency.lockutils [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.134 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:05:10 compute-0 sshd-session[218351]: Received disconnect from 154.90.59.75 port 51872:11: Bye Bye [preauth]
Nov 24 02:05:10 compute-0 sshd-session[218351]: Disconnected from authenticating user root 154.90.59.75 port 51872 [preauth]
Nov 24 02:05:10 compute-0 nova_compute[186999]: 2025-11-24 02:05:10.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:10 compute-0 podman[218353]: 2025-11-24 02:05:10.833114577 +0000 UTC m=+0.070840384 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.165 187003 DEBUG nova.network.neutron [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updating instance_info_cache with network_info: [{"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.181 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.184 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.184 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Instance network_info: |[{"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.185 187003 DEBUG oslo_concurrency.lockutils [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.185 187003 DEBUG nova.network.neutron [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing network info cache for port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.189 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Start _get_guest_xml network_info=[{"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.194 187003 WARNING nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.207 187003 DEBUG nova.virt.libvirt.host [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.208 187003 DEBUG nova.virt.libvirt.host [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.213 187003 DEBUG nova.virt.libvirt.host [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.214 187003 DEBUG nova.virt.libvirt.host [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.214 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.215 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.215 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.216 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.216 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.216 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.216 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.217 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.217 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.217 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.218 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.218 187003 DEBUG nova.virt.hardware [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.222 187003 DEBUG nova.virt.libvirt.vif [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-635875062',display_name='tempest-TestNetworkBasicOps-server-635875062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-635875062',id=12,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3waBa+eaFTNad4ti0sb6tqXDSl7UQTV+zGYr2FSTipX/CakXtx3nt2OpV5D9CUhosGHK2G/rPPzAtAN7xSIrXM0tRHOzGxafmEnHELoZXaGBFyEURrReci1+oNCR09zg==',key_name='tempest-TestNetworkBasicOps-1320781412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-2m1q606m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:05:07Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=5ddfb970-cf5c-460e-abac-d0f07ffe05c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.223 187003 DEBUG nova.network.os_vif_util [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.224 187003 DEBUG nova.network.os_vif_util [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.225 187003 DEBUG nova.objects.instance [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.242 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <uuid>5ddfb970-cf5c-460e-abac-d0f07ffe05c1</uuid>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <name>instance-0000000c</name>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-635875062</nova:name>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:05:11</nova:creationTime>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         <nova:port uuid="69e1c86d-39c3-43f3-9c75-c7ad5c634510">
Nov 24 02:05:11 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <system>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="serial">5ddfb970-cf5c-460e-abac-d0f07ffe05c1</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="uuid">5ddfb970-cf5c-460e-abac-d0f07ffe05c1</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </system>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <os>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </os>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <features>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </features>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.config"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:df:0e:c1"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <target dev="tap69e1c86d-39"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/console.log" append="off"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <video>
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </video>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:05:11 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:05:11 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:05:11 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:05:11 compute-0 nova_compute[186999]: </domain>
Nov 24 02:05:11 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.244 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Preparing to wait for external event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.245 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.245 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.245 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.246 187003 DEBUG nova.virt.libvirt.vif [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-635875062',display_name='tempest-TestNetworkBasicOps-server-635875062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-635875062',id=12,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3waBa+eaFTNad4ti0sb6tqXDSl7UQTV+zGYr2FSTipX/CakXtx3nt2OpV5D9CUhosGHK2G/rPPzAtAN7xSIrXM0tRHOzGxafmEnHELoZXaGBFyEURrReci1+oNCR09zg==',key_name='tempest-TestNetworkBasicOps-1320781412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-2m1q606m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:05:07Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=5ddfb970-cf5c-460e-abac-d0f07ffe05c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.246 187003 DEBUG nova.network.os_vif_util [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.247 187003 DEBUG nova.network.os_vif_util [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.247 187003 DEBUG os_vif [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.247 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.248 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.248 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.252 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.253 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69e1c86d-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.252 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b870f828-e429-4acb-8457-dd2521c13114', 'name': 'tempest-TestNetworkBasicOps-server-1543136382', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'hostId': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.254 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69e1c86d-39, col_values=(('external_ids', {'iface-id': '69e1c86d-39c3-43f3-9c75-c7ad5c634510', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:0e:c1', 'vm-uuid': '5ddfb970-cf5c-460e-abac-d0f07ffe05c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.257 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:11 compute-0 NetworkManager[55458]: <info>  [1763949911.2583] manager: (tap69e1c86d-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.262 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.265 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.267 187003 INFO os_vif [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39')
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.290 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.290 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b489699a-1f6d-4558-ab4d-13cec3542a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.253700', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '02253fa4-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'df36de85dc19d9266123f6b56ce1edc59ccbfa4ab67180decdbdf151d3d559e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.253700', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '02254ddc-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '3fdae4a559b446b5f2f4894b9ce8d571ab5a7dfec7fcb5b4d2a88bbdde98b4a7'}]}, 'timestamp': '2025-11-24 02:05:11.291229', '_unique_id': 'a8ca7ac44e704ffbb36343317c796f5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.292 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.297 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b870f828-e429-4acb-8457-dd2521c13114 / tap4fbe252d-e2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.297 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd9907f9-b750-4c74-9f5f-64b312458484', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.293329', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '02265790-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': 'db3cac8d99ed5e48ce43fb09343a02fb4fd500b60228dca0ccac3f76f0612572'}]}, 'timestamp': '2025-11-24 02:05:11.298050', '_unique_id': '454dfe57a2134c94ac32bf7852c173b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.298 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.299 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.299 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c26fc01c-3e9c-45e3-b103-dc919416e004', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.299763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '0226a920-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': 'd0f4cdad2efebaeb6e211ec9fcef15fbf67fa062884a0156711d0916be2ccf93'}]}, 'timestamp': '2025-11-24 02:05:11.300051', '_unique_id': '384c21275e794756b1cadab38ae9ccbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.300 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.301 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe2384d0-8db3-4846-ba0a-d1b6c8cab99f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.301373', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '0226e6ec-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': 'de6b20b5d6958d887439c63405f30ec91c2fdd033ecd68da7df694ebd5815bf9'}]}, 'timestamp': '2025-11-24 02:05:11.301621', '_unique_id': '36dd445bd20f410aa5ade2f1b2582248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.302 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.303 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.requests volume: 279 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.303 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eca2a469-567a-4404-8641-1db62fae370d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 279, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.303072', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '02272a12-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '471fa3b46d3c646de5dd6e182e37d4cb358899eaaf282515d8ab0bd943c796fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.303072', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '02273264-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '5c9741308ccfabc8f91a3e844a6ed1377768f7d7a732e1c094cc08195d27abc2'}]}, 'timestamp': '2025-11-24 02:05:11.303529', '_unique_id': 'f479e892d74449fcbcaba369e3f2ba6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.304 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b0723f6-a1b0-460b-b5b4-219c63db8153', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.304845', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '02276f7c-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'd78b08bd86c0cd727fb42d306b2fba8a73654e84c8f729a40b5b497938c2dec3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.304845', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '02277756-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'f6e58687d88a4d81aec172018b5856f4aff330cf12f60225cb32d152e33852e0'}]}, 'timestamp': '2025-11-24 02:05:11.305297', '_unique_id': 'd6beeb94fee2401c95fae6d2159177a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.305 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.306 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '139766d4-cff7-48f4-a3bd-dc6aa8ead943', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.306585', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '0227b252-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': 'c44eab160afea15794f155630d2ad7607dddd4144b39abe801513fecee2653fc'}]}, 'timestamp': '2025-11-24 02:05:11.306829', '_unique_id': '6d36aec71c1b4ffa9655a342a39812c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.317 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.318 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.318 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:df:0e:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.318 187003 INFO nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Using config drive
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.319 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.319 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e44f863-24e7-4bd3-bd51-363df9ea4dbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.307972', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0229a27e-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': '40ac77a9faf448722ca4dae9512c34075e74a1ff741220f07c6b4880c9a47ce3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 
'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.307972', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0229addc-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': 'fed56070d561e4fd2f7f4f894a29fc67f46e65d26e1ce1ae1696247b83c3b91b'}]}, 'timestamp': '2025-11-24 02:05:11.319814', '_unique_id': 'f1915038b3b94ec1801b94f6e20d1faa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.320 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.321 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b786d58-1956-49f1-9601-96dd3795c000', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.321785', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '022a0534-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'e2dc816c2af6ebb4f7daf48ecdf722faec0117cfd1713d402e7409f43aaeec88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.321785', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '022a0d90-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'e012cfe906b8b18baf40656b6dde91d9cc9049b53153ba61f84f2b47dad6b80e'}]}, 'timestamp': '2025-11-24 02:05:11.322249', '_unique_id': 'b7d8940961fc42958535e231edb949fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.323 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.323 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>]
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.323 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.338 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/memory.usage volume: 40.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '190d74db-d111-4764-8ddb-4a6d0731afda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40234375, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'timestamp': '2025-11-24T02:05:11.323726', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '022c8412-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.054066643, 'message_signature': 'cc7bf8345bbd65c43747cdaf0e961a6b827a14cb77a575daed4dd15d8d675bac'}]}, 'timestamp': '2025-11-24 02:05:11.338485', '_unique_id': 'd581e541a1b6494db3d01835fcd2e63b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.339 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>]
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>]
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1543136382>]
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e71376-171b-47a4-bdca-09a8b3c42ca8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.341015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022cf37a-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': '55f58915ec997802cc43a3d95872274c391b809bfe7a39a88feb406c716124ab'}]}, 'timestamp': '2025-11-24 02:05:11.341303', '_unique_id': '11729079fa704540bc83bd221471b75e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.341 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.342 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.342 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aabdd4d-fe5f-4650-b08d-59d878f91f27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.342484', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022d2c5a-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': '193a9d6fb2022eed1327cbd83e952d31e61f9aa63f01a7b4b4e9626ab40dd9f3'}]}, 'timestamp': '2025-11-24 02:05:11.342710', '_unique_id': '538c097199d14fce927eee4d93749e1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.343 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.latency volume: 1785025625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2e00bee-b3fb-4d6d-abf3-cee3fe292042', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1785025625, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.343777', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '022d5f9a-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '8980328656c23911fb1bb4d6b7e00211f89dbbd7f00216f7a525087d604a6eba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.343777', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '022d6788-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '3c8c7ef6a875875bef0d737dbb5eb167c505d94732c5fa2475a87040d9e05c5f'}]}, 'timestamp': '2025-11-24 02:05:11.344210', '_unique_id': '317d131b8d894725bc5349fef6d1a4f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.344 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.345 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.345 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fbd3ddc-ed49-49a7-9d05-9182f8eacc7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.345318', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '022d9b04-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': 'a3ec698eb53e7acbbc88ce07a884fe7bf30c915710f64c435e77874d26fb0d75'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.345318', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '022da2de-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': '13fd91b2420f8b5b41b3fdec1412fbf55efbf5da38f2d3d20f44025c16732890'}]}, 'timestamp': '2025-11-24 02:05:11.345728', '_unique_id': '20069c7c97084e8a8cf63ec1961f01e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.346 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.latency volume: 510433162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.read.latency volume: 51479149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29c781f5-ee3f-440d-b165-c7334b0b3535', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 510433162, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.346798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '022dd592-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': '2bf6a13124a1a742667c3658b60ff4118473c4cf6364a561992be59f265fdd60'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51479149, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 
'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.346798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '022ddd62-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3377.969879236, 'message_signature': 'dd2979e6c524c8d11e4e8611def84c4a40491c061a2d18b84880a2f51aa92e97'}]}, 'timestamp': '2025-11-24 02:05:11.347228', '_unique_id': '07ee8f4782f141c7af84d11b7f6fc955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.347 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '624a6f3c-e7ab-4b18-8f6f-7b9df28f6ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.348301', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022e0f94-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': '1d9965d415e8b6de8b3e0b3ad538e5d814525aba33885765556c1c2a5702e97c'}]}, 'timestamp': '2025-11-24 02:05:11.348526', '_unique_id': 'bb5dbbcf891e455a83861fe20aa7b654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.349 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.349 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.349 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95420e91-5246-49cc-b5c3-651031aa3b31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-vda', 'timestamp': '2025-11-24T02:05:11.349599', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '022e4220-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': 'f404f676d2655eee7d8015be61d00b875e60980a951c9b3892be2fe2374f5ecf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114-sda', 'timestamp': '2025-11-24T02:05:11.349599', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '022e4a90-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.024154726, 'message_signature': '5c97b2f7530ad4e5252c17dbbd4bf15b185048f1da6b258eebf327b4efbf848f'}]}, 'timestamp': '2025-11-24 02:05:11.350023', '_unique_id': '957add6f6b784566a19d52e6ec7f8e90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b015c65d-890e-48c2-a58b-da3d892e492e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.351104', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022e7d12-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': 'afb84cfb09913c3e323c325cbb9fca832edfd1d417ad336e3c2528a4d90786b1'}]}, 'timestamp': '2025-11-24 02:05:11.351329', '_unique_id': 'c5d3b3a035bd481baf1d350865c301e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.351 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.352 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '310dd729-d070-4be8-abe9-7c2a6fa3e6ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.352368', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022eae68-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': '890b32c0deb01f1f0f1f8abd4413a1d09ec4a17994e4ff867bb4b3a4a780482c'}]}, 'timestamp': '2025-11-24 02:05:11.352592', '_unique_id': '1a15378db3cf4903ab4d877e429eaa71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.353 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/cpu volume: 11440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '257bdbe8-52cc-41af-9ab0-005ab2cb7296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11440000000, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'timestamp': '2025-11-24T02:05:11.353690', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'instance-0000000b', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '022ee1ee-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.054066643, 'message_signature': 'd03b50f844877923f2644059ee1da2348c01cd4dcf7e8ca8d44ea8d9a3ee4f84'}]}, 'timestamp': '2025-11-24 02:05:11.353934', '_unique_id': '68114eeb37544bc78d459584d17ecd05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.354 12 DEBUG ceilometer.compute.pollsters [-] b870f828-e429-4acb-8457-dd2521c13114/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf747282-4cb2-4b0d-878f-531c6c9ea927', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_name': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_name': None, 'resource_id': 'instance-0000000b-b870f828-e429-4acb-8457-dd2521c13114-tap4fbe252d-e2', 'timestamp': '2025-11-24T02:05:11.354979', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1543136382', 'name': 'tap4fbe252d-e2', 'instance_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'instance_type': 'm1.nano', 'host': '709fab6a6f75f3004c84048b1986a386b4883f5e395010781f63be5e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1e8dafc-0e0f-4b06-ab61-2691966769fd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b6697012-8086-43d5-999a-6bb711240eaa'}, 'image_ref': 'b6697012-8086-43d5-999a-6bb711240eaa', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:a0:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4fbe252d-e2'}, 'message_id': '022f145c-c8da-11f0-959b-fa163eb968c1', 'monotonic_time': 3378.009521136, 'message_signature': '42e2fc8afa0bf4645aacfaa6bb9415e1f87a0a9856725275a1a22071abe56566'}]}, 'timestamp': '2025-11-24 02:05:11.355202', '_unique_id': 'f5191a63e31f4d00a530614d2a4e0571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 02:05:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:05:11.355 12 ERROR oslo_messaging.notify.messaging 
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.872 187003 INFO nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Creating config drive at /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.config
Nov 24 02:05:11 compute-0 nova_compute[186999]: 2025-11-24 02:05:11.877 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmsmxfx9e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.001 187003 DEBUG oslo_concurrency.processutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmsmxfx9e" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:12 compute-0 kernel: tap69e1c86d-39: entered promiscuous mode
Nov 24 02:05:12 compute-0 NetworkManager[55458]: <info>  [1763949912.0778] manager: (tap69e1c86d-39): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Nov 24 02:05:12 compute-0 ovn_controller[95380]: 2025-11-24T02:05:12Z|00144|binding|INFO|Claiming lport 69e1c86d-39c3-43f3-9c75-c7ad5c634510 for this chassis.
Nov 24 02:05:12 compute-0 ovn_controller[95380]: 2025-11-24T02:05:12Z|00145|binding|INFO|69e1c86d-39c3-43f3-9c75-c7ad5c634510: Claiming fa:16:3e:df:0e:c1 10.100.0.14
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.079 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.086 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:0e:c1 10.100.0.14'], port_security=['fa:16:3e:df:0e:c1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5ddfb970-cf5c-460e-abac-d0f07ffe05c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '23dee10e-6408-415d-97df-89c09653122e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb1fb74-fc05-473e-b6a7-f2e41e415ed2, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=69e1c86d-39c3-43f3-9c75-c7ad5c634510) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.087 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 in datapath 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 bound to our chassis
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.088 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737
Nov 24 02:05:12 compute-0 ovn_controller[95380]: 2025-11-24T02:05:12Z|00146|binding|INFO|Setting lport 69e1c86d-39c3-43f3-9c75-c7ad5c634510 up in Southbound
Nov 24 02:05:12 compute-0 ovn_controller[95380]: 2025-11-24T02:05:12Z|00147|binding|INFO|Setting lport 69e1c86d-39c3-43f3-9c75-c7ad5c634510 ovn-installed in OVS
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.095 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.098 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.114 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9e0436-e51d-4898-9dfa-65561e67d15e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 systemd-machined[153319]: New machine qemu-12-instance-0000000c.
Nov 24 02:05:12 compute-0 systemd-udevd[218397]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:05:12 compute-0 NetworkManager[55458]: <info>  [1763949912.1385] device (tap69e1c86d-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:05:12 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Nov 24 02:05:12 compute-0 NetworkManager[55458]: <info>  [1763949912.1396] device (tap69e1c86d-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.151 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[71458505-ee30-4d60-965e-b37177a06d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.155 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[e25b0d31-67b2-40a8-abf1-826b92d1b85d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.185 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[a72df5e1-8c75-4b42-9e73-66cbda274c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.204 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[30c1f14c-e671-4d2a-84eb-6bbe336c4312]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67be9a0e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:c9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336364, 'reachable_time': 39192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218408, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.221 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ca2d13-bd83-448f-bcdd-ea339ac23228]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap67be9a0e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336381, 'tstamp': 336381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218410, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap67be9a0e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336386, 'tstamp': 336386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218410, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.223 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67be9a0e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.281 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.283 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.284 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67be9a0e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.284 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.285 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap67be9a0e-00, col_values=(('external_ids', {'iface-id': 'f321bed4-e0fc-4886-ba7a-71eb9cae7cc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:12 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:12.285 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.288 187003 DEBUG nova.compute.manager [req-6d28b339-d810-4465-be3d-7671c06d635f req-1ccde021-2db9-471a-92f9-9c69bcf4e980 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.288 187003 DEBUG oslo_concurrency.lockutils [req-6d28b339-d810-4465-be3d-7671c06d635f req-1ccde021-2db9-471a-92f9-9c69bcf4e980 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.288 187003 DEBUG oslo_concurrency.lockutils [req-6d28b339-d810-4465-be3d-7671c06d635f req-1ccde021-2db9-471a-92f9-9c69bcf4e980 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.289 187003 DEBUG oslo_concurrency.lockutils [req-6d28b339-d810-4465-be3d-7671c06d635f req-1ccde021-2db9-471a-92f9-9c69bcf4e980 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.289 187003 DEBUG nova.compute.manager [req-6d28b339-d810-4465-be3d-7671c06d635f req-1ccde021-2db9-471a-92f9-9c69bcf4e980 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Processing event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.521 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949912.5210488, 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.522 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] VM Started (Lifecycle Event)
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.523 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.527 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.531 187003 INFO nova.virt.libvirt.driver [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Instance spawned successfully.
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.532 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.537 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.540 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.553 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.553 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.553 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.554 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.554 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.554 187003 DEBUG nova.virt.libvirt.driver [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.559 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.559 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949912.5212142, 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.559 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] VM Paused (Lifecycle Event)
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.578 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.582 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949912.5262845, 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.582 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] VM Resumed (Lifecycle Event)
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.602 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.606 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.611 187003 INFO nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Took 5.56 seconds to spawn the instance on the hypervisor.
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.611 187003 DEBUG nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.630 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.636 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.660 187003 INFO nova.compute.manager [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Took 5.97 seconds to build instance.
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.677 187003 DEBUG oslo_concurrency.lockutils [None req-6548e468-e23a-4fc5-a636-b7699ae32d5a e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.838 187003 DEBUG nova.network.neutron [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updated VIF entry in instance network info cache for port 69e1c86d-39c3-43f3-9c75-c7ad5c634510. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.838 187003 DEBUG nova.network.neutron [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updating instance_info_cache with network_info: [{"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.855 187003 DEBUG oslo_concurrency.lockutils [req-073f2aa6-8f1f-4d1f-8f20-5bf418d0f6dc req-525b9403-9683-4d05-b38a-0f1a79d1b2d2 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.919 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.919 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.920 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 02:05:12 compute-0 nova_compute[186999]: 2025-11-24 02:05:12.920 187003 DEBUG nova.objects.instance [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b870f828-e429-4acb-8457-dd2521c13114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.307 187003 DEBUG nova.network.neutron [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.320 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.320 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.384 187003 DEBUG nova.compute.manager [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.385 187003 DEBUG oslo_concurrency.lockutils [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.385 187003 DEBUG oslo_concurrency.lockutils [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.386 187003 DEBUG oslo_concurrency.lockutils [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.386 187003 DEBUG nova.compute.manager [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] No waiting events found dispatching network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:14 compute-0 nova_compute[186999]: 2025-11-24 02:05:14.386 187003 WARNING nova.compute.manager [req-a1c33eef-f6c2-4e38-a960-da378d553923 req-01bf4a07-301e-4b88-b00e-5e525a80f203 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received unexpected event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 for instance with vm_state active and task_state None.
Nov 24 02:05:14 compute-0 podman[218419]: 2025-11-24 02:05:14.828874793 +0000 UTC m=+0.067350457 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 24 02:05:15 compute-0 nova_compute[186999]: 2025-11-24 02:05:15.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:15 compute-0 nova_compute[186999]: 2025-11-24 02:05:15.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.257 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.589 187003 DEBUG nova.compute.manager [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.589 187003 DEBUG nova.compute.manager [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing instance network info cache due to event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.590 187003 DEBUG oslo_concurrency.lockutils [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.590 187003 DEBUG oslo_concurrency.lockutils [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.590 187003 DEBUG nova.network.neutron [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing network info cache for port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:16 compute-0 nova_compute[186999]: 2025-11-24 02:05:16.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:17 compute-0 nova_compute[186999]: 2025-11-24 02:05:17.639 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:17 compute-0 nova_compute[186999]: 2025-11-24 02:05:17.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:17 compute-0 nova_compute[186999]: 2025-11-24 02:05:17.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.795 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.796 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:05:18 compute-0 podman[218441]: 2025-11-24 02:05:18.821773643 +0000 UTC m=+0.064854238 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 02:05:18 compute-0 podman[218440]: 2025-11-24 02:05:18.84951997 +0000 UTC m=+0.096268468 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 02:05:18 compute-0 podman[218442]: 2025-11-24 02:05:18.859747907 +0000 UTC m=+0.102026750 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.887 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.951 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:18 compute-0 nova_compute[186999]: 2025-11-24 02:05:18.953 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.024 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.030 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.094 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.095 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.150 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.301 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.302 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5417MB free_disk=73.42948532104492GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.303 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.303 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.385 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance b870f828-e429-4acb-8457-dd2521c13114 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.386 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.386 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.386 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.450 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.463 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.483 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:05:19 compute-0 nova_compute[186999]: 2025-11-24 02:05:19.484 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.261 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.485 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.485 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.712 187003 DEBUG nova.network.neutron [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updated VIF entry in instance network info cache for port 69e1c86d-39c3-43f3-9c75-c7ad5c634510. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.713 187003 DEBUG nova.network.neutron [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updating instance_info_cache with network_info: [{"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:21 compute-0 nova_compute[186999]: 2025-11-24 02:05:21.730 187003 DEBUG oslo_concurrency.lockutils [req-32b7785e-48b6-470c-af4e-eb7d9a1cd8ab req-2d592f30-bb36-49ee-a0e3-1fb5b8fe9a71 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:22 compute-0 nova_compute[186999]: 2025-11-24 02:05:22.642 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:24 compute-0 ovn_controller[95380]: 2025-11-24T02:05:24Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:0e:c1 10.100.0.14
Nov 24 02:05:24 compute-0 ovn_controller[95380]: 2025-11-24T02:05:24Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:0e:c1 10.100.0.14
Nov 24 02:05:26 compute-0 nova_compute[186999]: 2025-11-24 02:05:26.265 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:27 compute-0 nova_compute[186999]: 2025-11-24 02:05:27.645 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:30 compute-0 podman[218534]: 2025-11-24 02:05:30.826506865 +0000 UTC m=+0.075742623 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:05:31 compute-0 nova_compute[186999]: 2025-11-24 02:05:31.268 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:31 compute-0 sshd-session[218540]: Invalid user student from 46.188.119.26 port 38124
Nov 24 02:05:31 compute-0 sshd-session[218540]: Received disconnect from 46.188.119.26 port 38124:11: Bye Bye [preauth]
Nov 24 02:05:31 compute-0 sshd-session[218540]: Disconnected from invalid user student 46.188.119.26 port 38124 [preauth]
Nov 24 02:05:32 compute-0 nova_compute[186999]: 2025-11-24 02:05:32.649 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:32 compute-0 nova_compute[186999]: 2025-11-24 02:05:32.835 187003 INFO nova.compute.manager [None req-19f3fef0-f7af-408e-b4fd-d292f49dd6a1 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Get console output
Nov 24 02:05:32 compute-0 nova_compute[186999]: 2025-11-24 02:05:32.840 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:05:34 compute-0 nova_compute[186999]: 2025-11-24 02:05:34.135 187003 DEBUG nova.compute.manager [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-changed-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:34 compute-0 nova_compute[186999]: 2025-11-24 02:05:34.136 187003 DEBUG nova.compute.manager [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing instance network info cache due to event network-changed-4fbe252d-e231-4421-9a71-f8470765731a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:34 compute-0 nova_compute[186999]: 2025-11-24 02:05:34.138 187003 DEBUG oslo_concurrency.lockutils [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:34 compute-0 nova_compute[186999]: 2025-11-24 02:05:34.138 187003 DEBUG oslo_concurrency.lockutils [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:34 compute-0 nova_compute[186999]: 2025-11-24 02:05:34.138 187003 DEBUG nova.network.neutron [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:35 compute-0 nova_compute[186999]: 2025-11-24 02:05:35.075 187003 INFO nova.compute.manager [None req-d5263dbe-c5e2-42a0-bc05-16b5191fa6f3 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Get console output
Nov 24 02:05:35 compute-0 nova_compute[186999]: 2025-11-24 02:05:35.081 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:05:35 compute-0 nova_compute[186999]: 2025-11-24 02:05:35.707 187003 DEBUG nova.network.neutron [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updated VIF entry in instance network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:05:35 compute-0 nova_compute[186999]: 2025-11-24 02:05:35.708 187003 DEBUG nova.network.neutron [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:35 compute-0 nova_compute[186999]: 2025-11-24 02:05:35.726 187003 DEBUG oslo_concurrency.lockutils [req-1ae6518b-0d83-4599-8e4b-9a7a9bad6565 req-67cb3bc4-9796-4bed-a8c8-c006d4605cee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:35 compute-0 podman[218556]: 2025-11-24 02:05:35.834269616 +0000 UTC m=+0.073139410 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.221 187003 DEBUG nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.222 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.223 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.223 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.223 187003 DEBUG nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.224 187003 WARNING nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state active and task_state None.
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.224 187003 DEBUG nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.224 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.225 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.225 187003 DEBUG oslo_concurrency.lockutils [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.225 187003 DEBUG nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.226 187003 WARNING nova.compute.manager [req-31b335fb-d383-4f12-8826-1b3e9982c201 req-fa12cf8d-76ba-4438-9781-fa2d508e53a1 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state active and task_state None.
Nov 24 02:05:36 compute-0 nova_compute[186999]: 2025-11-24 02:05:36.271 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:37 compute-0 nova_compute[186999]: 2025-11-24 02:05:37.165 187003 INFO nova.compute.manager [None req-2e2973c3-c463-41b8-a6f0-57eb9a36e833 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Get console output
Nov 24 02:05:37 compute-0 nova_compute[186999]: 2025-11-24 02:05:37.171 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:05:37 compute-0 nova_compute[186999]: 2025-11-24 02:05:37.653 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.103 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.104 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.105 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.105 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.106 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.107 187003 INFO nova.compute.manager [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Terminating instance
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.108 187003 DEBUG nova.compute.manager [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:05:38 compute-0 kernel: tap69e1c86d-39 (unregistering): left promiscuous mode
Nov 24 02:05:38 compute-0 NetworkManager[55458]: <info>  [1763949938.1330] device (tap69e1c86d-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.146 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 ovn_controller[95380]: 2025-11-24T02:05:38Z|00148|binding|INFO|Releasing lport 69e1c86d-39c3-43f3-9c75-c7ad5c634510 from this chassis (sb_readonly=0)
Nov 24 02:05:38 compute-0 ovn_controller[95380]: 2025-11-24T02:05:38Z|00149|binding|INFO|Setting lport 69e1c86d-39c3-43f3-9c75-c7ad5c634510 down in Southbound
Nov 24 02:05:38 compute-0 ovn_controller[95380]: 2025-11-24T02:05:38Z|00150|binding|INFO|Removing iface tap69e1c86d-39 ovn-installed in OVS
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.149 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.156 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:0e:c1 10.100.0.14'], port_security=['fa:16:3e:df:0e:c1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5ddfb970-cf5c-460e-abac-d0f07ffe05c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23dee10e-6408-415d-97df-89c09653122e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb1fb74-fc05-473e-b6a7-f2e41e415ed2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=69e1c86d-39c3-43f3-9c75-c7ad5c634510) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.158 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 in datapath 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 unbound from our chassis
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.160 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.166 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.179 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[13963c7a-7814-4c32-ae2c-37d4640d2af4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 24 02:05:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 13.129s CPU time.
Nov 24 02:05:38 compute-0 systemd-machined[153319]: Machine qemu-12-instance-0000000c terminated.
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.211 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2a9d80-51bd-4b57-af73-b8eebbd76e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.216 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[a0acca0f-68c7-4bb4-b5a1-7e007644f200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.255 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[81f8f7af-6bd8-4e0a-a839-f70defba30f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.274 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[be6847b0-b20c-4a29-a24c-3186dcf6b69f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap67be9a0e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:c9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336364, 'reachable_time': 39192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218590, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.292 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd384c4-502b-4228-8077-2e15686cadcf]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap67be9a0e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336381, 'tstamp': 336381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218591, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap67be9a0e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336386, 'tstamp': 336386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218591, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.294 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67be9a0e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.296 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.301 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap67be9a0e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.302 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.302 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap67be9a0e-00, col_values=(('external_ids', {'iface-id': 'f321bed4-e0fc-4886-ba7a-71eb9cae7cc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:38 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:38.302 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.304 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-changed-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.304 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing instance network info cache due to event network-changed-4fbe252d-e231-4421-9a71-f8470765731a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.305 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.305 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.305 187003 DEBUG nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.306 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.379 187003 INFO nova.virt.libvirt.driver [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Instance destroyed successfully.
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.381 187003 DEBUG nova.objects.instance [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.394 187003 DEBUG nova.virt.libvirt.vif [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-635875062',display_name='tempest-TestNetworkBasicOps-server-635875062',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-635875062',id=12,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3waBa+eaFTNad4ti0sb6tqXDSl7UQTV+zGYr2FSTipX/CakXtx3nt2OpV5D9CUhosGHK2G/rPPzAtAN7xSIrXM0tRHOzGxafmEnHELoZXaGBFyEURrReci1+oNCR09zg==',key_name='tempest-TestNetworkBasicOps-1320781412',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:05:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-2m1q606m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:05:12Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=5ddfb970-cf5c-460e-abac-d0f07ffe05c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.394 187003 DEBUG nova.network.os_vif_util [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "address": "fa:16:3e:df:0e:c1", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69e1c86d-39", "ovs_interfaceid": "69e1c86d-39c3-43f3-9c75-c7ad5c634510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.395 187003 DEBUG nova.network.os_vif_util [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.395 187003 DEBUG os_vif [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.400 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.401 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69e1c86d-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.403 187003 DEBUG nova.compute.manager [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-unplugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.403 187003 DEBUG oslo_concurrency.lockutils [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.403 187003 DEBUG oslo_concurrency.lockutils [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.404 187003 DEBUG oslo_concurrency.lockutils [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.404 187003 DEBUG nova.compute.manager [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] No waiting events found dispatching network-vif-unplugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.404 187003 DEBUG nova.compute.manager [req-1c565c54-dc9d-4f68-81bd-30d5d5afcea3 req-087713b2-6798-401d-b959-79fa80a1b89b 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-unplugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.439 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.441 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.445 187003 INFO os_vif [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:0e:c1,bridge_name='br-int',has_traffic_filtering=True,id=69e1c86d-39c3-43f3-9c75-c7ad5c634510,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69e1c86d-39')
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.445 187003 INFO nova.virt.libvirt.driver [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Deleting instance files /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1_del
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.446 187003 INFO nova.virt.libvirt.driver [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Deletion of /var/lib/nova/instances/5ddfb970-cf5c-460e-abac-d0f07ffe05c1_del complete
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.502 187003 INFO nova.compute.manager [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.503 187003 DEBUG oslo.service.loopingcall [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.504 187003 DEBUG nova.compute.manager [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:05:38 compute-0 nova_compute[186999]: 2025-11-24 02:05:38.504 187003 DEBUG nova.network.neutron [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.120 187003 DEBUG nova.network.neutron [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.131 187003 INFO nova.compute.manager [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Took 0.63 seconds to deallocate network for instance.
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.183 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.184 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.249 187003 DEBUG nova.compute.provider_tree [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.261 187003 DEBUG nova.scheduler.client.report [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.281 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.306 187003 INFO nova.scheduler.client.report [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance 5ddfb970-cf5c-460e-abac-d0f07ffe05c1
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.356 187003 DEBUG oslo_concurrency.lockutils [None req-24497546-541e-4f27-9389-cab55a5e9d34 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.410 187003 DEBUG nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updated VIF entry in instance network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.410 187003 DEBUG nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [{"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.426 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.426 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.426 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.426 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.426 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 WARNING nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state active and task_state None.
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.427 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 WARNING nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state active and task_state None.
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 DEBUG nova.compute.manager [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing instance network info cache due to event network-changed-69e1c86d-39c3-43f3-9c75-c7ad5c634510. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.428 187003 DEBUG nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Refreshing network info cache for port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.436 187003 DEBUG nova.compute.utils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.683 187003 INFO nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Port 69e1c86d-39c3-43f3-9c75-c7ad5c634510 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.683 187003 DEBUG nova.network.neutron [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:39 compute-0 nova_compute[186999]: 2025-11-24 02:05:39.699 187003 DEBUG oslo_concurrency.lockutils [req-92636cfc-bc90-4747-86cf-6b477e627b94 req-68697b72-1611-40fb-a8d5-89b939348652 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-5ddfb970-cf5c-460e-abac-d0f07ffe05c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.406 187003 DEBUG nova.compute.manager [req-f56e7d86-267b-4bae-a9d4-50b236a74e06 req-8463778c-db74-4582-aad1-5daf3cd0884d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-deleted-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.498 187003 DEBUG nova.compute.manager [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.499 187003 DEBUG oslo_concurrency.lockutils [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.499 187003 DEBUG oslo_concurrency.lockutils [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.499 187003 DEBUG oslo_concurrency.lockutils [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "5ddfb970-cf5c-460e-abac-d0f07ffe05c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.500 187003 DEBUG nova.compute.manager [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] No waiting events found dispatching network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.500 187003 WARNING nova.compute.manager [req-c220a8ee-fab9-4240-8076-949eab6885ff req-a9a12ec2-ea08-47ab-a190-fc5014671caa 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Received unexpected event network-vif-plugged-69e1c86d-39c3-43f3-9c75-c7ad5c634510 for instance with vm_state deleted and task_state None.
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.602 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.602 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.603 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.603 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.603 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.604 187003 INFO nova.compute.manager [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Terminating instance
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.605 187003 DEBUG nova.compute.manager [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:05:40 compute-0 kernel: tap4fbe252d-e2 (unregistering): left promiscuous mode
Nov 24 02:05:40 compute-0 NetworkManager[55458]: <info>  [1763949940.6300] device (tap4fbe252d-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:05:40 compute-0 ovn_controller[95380]: 2025-11-24T02:05:40Z|00151|binding|INFO|Releasing lport 4fbe252d-e231-4421-9a71-f8470765731a from this chassis (sb_readonly=0)
Nov 24 02:05:40 compute-0 ovn_controller[95380]: 2025-11-24T02:05:40Z|00152|binding|INFO|Setting lport 4fbe252d-e231-4421-9a71-f8470765731a down in Southbound
Nov 24 02:05:40 compute-0 ovn_controller[95380]: 2025-11-24T02:05:40Z|00153|binding|INFO|Removing iface tap4fbe252d-e2 ovn-installed in OVS
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.643 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.645 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:40.649 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:a0:3b 10.100.0.3'], port_security=['fa:16:3e:30:a0:3b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b870f828-e429-4acb-8457-dd2521c13114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '8', 'neutron:security_group_ids': '882b8133-50bb-4df0-b17c-f8a6f2c6d8a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb1fb74-fc05-473e-b6a7-f2e41e415ed2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=4fbe252d-e231-4421-9a71-f8470765731a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:05:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:40.650 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 4fbe252d-e231-4421-9a71-f8470765731a in datapath 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 unbound from our chassis
Nov 24 02:05:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:40.651 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:05:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:40.652 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[d9dbabfe-1aa7-44ea-9a02-9bdb139e5601]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:40 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:40.653 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 namespace which is not needed anymore
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.664 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 24 02:05:40 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.169s CPU time.
Nov 24 02:05:40 compute-0 systemd-machined[153319]: Machine qemu-11-instance-0000000b terminated.
Nov 24 02:05:40 compute-0 NetworkManager[55458]: <info>  [1763949940.8285] manager: (tap4fbe252d-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [NOTICE]   (218264) : haproxy version is 2.8.14-c23fe91
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [NOTICE]   (218264) : path to executable is /usr/sbin/haproxy
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [WARNING]  (218264) : Exiting Master process...
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [WARNING]  (218264) : Exiting Master process...
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.873 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [ALERT]    (218264) : Current worker (218266) exited with code 143 (Terminated)
Nov 24 02:05:40 compute-0 neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737[218260]: [WARNING]  (218264) : All workers exited. Exiting... (0)
Nov 24 02:05:40 compute-0 podman[218632]: 2025-11-24 02:05:40.888162629 +0000 UTC m=+0.092931015 container died aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:05:40 compute-0 systemd[1]: libpod-aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8.scope: Deactivated successfully.
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.916 187003 INFO nova.virt.libvirt.driver [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance destroyed successfully.
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.917 187003 DEBUG nova.objects.instance [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid b870f828-e429-4acb-8457-dd2521c13114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8-userdata-shm.mount: Deactivated successfully.
Nov 24 02:05:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-d253a647779153898389aa5cb499205007e06db27cbe41cb63df881fccfed4c3-merged.mount: Deactivated successfully.
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.929 187003 DEBUG nova.virt.libvirt.vif [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:04:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1543136382',display_name='tempest-TestNetworkBasicOps-server-1543136382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1543136382',id=11,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSP5FYoevblVVxr+BlHTysiWoOrbVN2UVjIu6ow/i8LBB5RNm/LYmCpco9bNSaiFRAxNFEdqZvYlD2+9SJuOtadsfugvNA6DYV5TI4dIdeRKmrGhNemySVx7Nw/dvl0WA==',key_name='tempest-TestNetworkBasicOps-14717754',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:04:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-foj6r87p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:04:57Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=b870f828-e429-4acb-8457-dd2521c13114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.931 187003 DEBUG nova.network.os_vif_util [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "4fbe252d-e231-4421-9a71-f8470765731a", "address": "fa:16:3e:30:a0:3b", "network": {"id": "67be9a0e-0da1-48ec-8b2b-8b93cf4e1737", "bridge": "br-int", "label": "tempest-network-smoke--1689384760", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fbe252d-e2", "ovs_interfaceid": "4fbe252d-e231-4421-9a71-f8470765731a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.932 187003 DEBUG nova.network.os_vif_util [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.933 187003 DEBUG os_vif [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.935 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 podman[218632]: 2025-11-24 02:05:40.935489335 +0000 UTC m=+0.140257691 container cleanup aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.935 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fbe252d-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.938 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.940 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.943 187003 INFO os_vif [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:a0:3b,bridge_name='br-int',has_traffic_filtering=True,id=4fbe252d-e231-4421-9a71-f8470765731a,network=Network(67be9a0e-0da1-48ec-8b2b-8b93cf4e1737),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fbe252d-e2')
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.944 187003 INFO nova.virt.libvirt.driver [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Deleting instance files /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114_del
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.944 187003 INFO nova.virt.libvirt.driver [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Deletion of /var/lib/nova/instances/b870f828-e429-4acb-8457-dd2521c13114_del complete
Nov 24 02:05:40 compute-0 systemd[1]: libpod-conmon-aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8.scope: Deactivated successfully.
Nov 24 02:05:40 compute-0 podman[218654]: 2025-11-24 02:05:40.979356164 +0000 UTC m=+0.074372015 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.993 187003 INFO nova.compute.manager [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.994 187003 DEBUG oslo.service.loopingcall [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.995 187003 DEBUG nova.compute.manager [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:05:40 compute-0 nova_compute[186999]: 2025-11-24 02:05:40.996 187003 DEBUG nova.network.neutron [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:05:41 compute-0 podman[218687]: 2025-11-24 02:05:41.011656229 +0000 UTC m=+0.046998778 container remove aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.017 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[780eed35-27f7-4463-8159-642fdd8c0a5d]: (4, ('Mon Nov 24 02:05:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 (aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8)\naa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8\nMon Nov 24 02:05:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 (aa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8)\naa58181e7c2d7d1472f4e5c8ee73e23edade3a550ffcc2bc4e29052c4b18fcd8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.019 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[76e6848e-7af5-4de3-b85e-1b54bb5e8f95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.020 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap67be9a0e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.022 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:41 compute-0 kernel: tap67be9a0e-00: left promiscuous mode
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.027 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfc6fa4-3698-4bd3-8bcb-5003a78b1cf7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.035 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.043 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf48e19-103a-48f9-aa56-8e52f26b6ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.045 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[cc00dab0-ea93-49fe-8394-10c57fe26742]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.063 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e46e23f0-9bc6-4842-ac2c-c000d15e8a58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336355, 'reachable_time': 41325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218711, 'error': None, 'target': 'ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d67be9a0e\x2d0da1\x2d48ec\x2d8b2b\x2d8b93cf4e1737.mount: Deactivated successfully.
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.069 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-67be9a0e-0da1-48ec-8b2b-8b93cf4e1737 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:05:41 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:41.069 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[13851ba0-faf6-4c0a-a3fb-3a4bbe823263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.493 187003 DEBUG nova.network.neutron [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.517 187003 INFO nova.compute.manager [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] Took 0.52 seconds to deallocate network for instance.
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.563 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.563 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.618 187003 DEBUG nova.compute.provider_tree [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.630 187003 DEBUG nova.scheduler.client.report [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.648 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.675 187003 INFO nova.scheduler.client.report [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance b870f828-e429-4acb-8457-dd2521c13114
Nov 24 02:05:41 compute-0 nova_compute[186999]: 2025-11-24 02:05:41.730 187003 DEBUG oslo_concurrency.lockutils [None req-6a2ff8aa-373f-4f77-9fcc-ed7c49d5a156 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.593 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-changed-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.594 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing instance network info cache due to event network-changed-4fbe252d-e231-4421-9a71-f8470765731a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.594 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.594 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.594 187003 DEBUG nova.network.neutron [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Refreshing network info cache for port 4fbe252d-e231-4421-9a71-f8470765731a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.654 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:42 compute-0 nova_compute[186999]: 2025-11-24 02:05:42.707 187003 DEBUG nova.network.neutron [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.268 187003 DEBUG nova.network.neutron [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.269 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-b870f828-e429-4acb-8457-dd2521c13114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.269 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.269 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.269 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.270 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.270 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.270 187003 WARNING nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-unplugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state deleted and task_state None.
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.271 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.271 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "b870f828-e429-4acb-8457-dd2521c13114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.271 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.271 187003 DEBUG oslo_concurrency.lockutils [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "b870f828-e429-4acb-8457-dd2521c13114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.271 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] No waiting events found dispatching network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.272 187003 WARNING nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received unexpected event network-vif-plugged-4fbe252d-e231-4421-9a71-f8470765731a for instance with vm_state deleted and task_state None.
Nov 24 02:05:43 compute-0 nova_compute[186999]: 2025-11-24 02:05:43.272 187003 DEBUG nova.compute.manager [req-896829ae-8b52-4929-88a6-6a9cc593a706 req-5215d995-8c29-41e1-9593-f9345652e5e9 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: b870f828-e429-4acb-8457-dd2521c13114] Received event network-vif-deleted-4fbe252d-e231-4421-9a71-f8470765731a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:45 compute-0 nova_compute[186999]: 2025-11-24 02:05:45.401 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:45 compute-0 nova_compute[186999]: 2025-11-24 02:05:45.509 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:45 compute-0 podman[218713]: 2025-11-24 02:05:45.817124623 +0000 UTC m=+0.061066482 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 02:05:45 compute-0 nova_compute[186999]: 2025-11-24 02:05:45.939 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:47 compute-0 nova_compute[186999]: 2025-11-24 02:05:47.657 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:48.338 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:05:48 compute-0 nova_compute[186999]: 2025-11-24 02:05:48.339 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:48.341 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:05:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:48.427 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:49 compute-0 podman[218733]: 2025-11-24 02:05:49.83003078 +0000 UTC m=+0.073504520 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 02:05:49 compute-0 podman[218734]: 2025-11-24 02:05:49.843111547 +0000 UTC m=+0.084399056 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:05:49 compute-0 podman[218735]: 2025-11-24 02:05:49.859759593 +0000 UTC m=+0.098375757 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 24 02:05:50 compute-0 nova_compute[186999]: 2025-11-24 02:05:50.972 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:52 compute-0 nova_compute[186999]: 2025-11-24 02:05:52.658 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:53 compute-0 nova_compute[186999]: 2025-11-24 02:05:53.377 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949938.3752308, 5ddfb970-cf5c-460e-abac-d0f07ffe05c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:05:53 compute-0 nova_compute[186999]: 2025-11-24 02:05:53.377 187003 INFO nova.compute.manager [-] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] VM Stopped (Lifecycle Event)
Nov 24 02:05:53 compute-0 nova_compute[186999]: 2025-11-24 02:05:53.398 187003 DEBUG nova.compute.manager [None req-636e3c35-6341-4633-834f-32b7f0c2a2b1 - - - - - -] [instance: 5ddfb970-cf5c-460e-abac-d0f07ffe05c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:54 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:54.344 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.784 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.785 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.799 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.873 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.873 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.879 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.880 187003 INFO nova.compute.claims [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Claim successful on node compute-0.ctlplane.example.com
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.914 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949940.9124777, b870f828-e429-4acb-8457-dd2521c13114 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.914 187003 INFO nova.compute.manager [-] [instance: b870f828-e429-4acb-8457-dd2521c13114] VM Stopped (Lifecycle Event)
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.930 187003 DEBUG nova.compute.manager [None req-e5555e60-9e5e-4cc7-9842-fcaae9a71457 - - - - - -] [instance: b870f828-e429-4acb-8457-dd2521c13114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.972 187003 DEBUG nova.compute.provider_tree [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.974 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:55 compute-0 nova_compute[186999]: 2025-11-24 02:05:55.982 187003 DEBUG nova.scheduler.client.report [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.017 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.018 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.072 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.073 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.088 187003 INFO nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.104 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.298 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.299 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.300 187003 INFO nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Creating image(s)
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.300 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.300 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.301 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.312 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.332 187003 DEBUG nova.policy [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6a149dd53b548a9bac30b99c4c1141f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.372 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.373 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.374 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.390 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.452 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.454 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.496 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1,backing_fmt=raw /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.497 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "fd785949e251ccc7b996d002b8cf9a82e50736d1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.497 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.569 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fd785949e251ccc7b996d002b8cf9a82e50736d1 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.570 187003 DEBUG nova.virt.disk.api [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Checking if we can resize image /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.571 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.631 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.632 187003 DEBUG nova.virt.disk.api [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Cannot resize image /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.633 187003 DEBUG nova.objects.instance [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'migration_context' on Instance uuid ef286a52-dd45-4442-8b3e-46d42c8631b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.646 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.647 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Ensure instance console log exists: /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.648 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.648 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.648 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:56 compute-0 nova_compute[186999]: 2025-11-24 02:05:56.992 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Successfully created port: 03d8223f-1775-404e-821d-bb39489ef176 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.502 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Successfully updated port: 03d8223f-1775-404e-821d-bb39489ef176 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.519 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.519 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquired lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.519 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.600 187003 DEBUG nova.compute.manager [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-changed-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.600 187003 DEBUG nova.compute.manager [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing instance network info cache due to event network-changed-03d8223f-1775-404e-821d-bb39489ef176. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.600 187003 DEBUG oslo_concurrency.lockutils [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.657 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 02:05:57 compute-0 nova_compute[186999]: 2025-11-24 02:05:57.660 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.793 187003 DEBUG nova.network.neutron [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updating instance_info_cache with network_info: [{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.812 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Releasing lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.813 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Instance network_info: |[{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.813 187003 DEBUG oslo_concurrency.lockutils [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.813 187003 DEBUG nova.network.neutron [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing network info cache for port 03d8223f-1775-404e-821d-bb39489ef176 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.816 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Start _get_guest_xml network_info=[{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'image_id': 'b6697012-8086-43d5-999a-6bb711240eaa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.821 187003 WARNING nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.829 187003 DEBUG nova.virt.libvirt.host [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.829 187003 DEBUG nova.virt.libvirt.host [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.833 187003 DEBUG nova.virt.libvirt.host [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.834 187003 DEBUG nova.virt.libvirt.host [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.834 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.834 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T01:56:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1e8dafc-0e0f-4b06-ab61-2691966769fd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T01:56:22Z,direct_url=<?>,disk_format='qcow2',id=b6697012-8086-43d5-999a-6bb711240eaa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f637cef21e80464abf1687ea895028cc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T01:56:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.835 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.835 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.835 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.836 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.836 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.836 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.836 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.837 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.837 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.837 187003 DEBUG nova.virt.hardware [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.841 187003 DEBUG nova.virt.libvirt.vif [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-336978854',display_name='tempest-TestNetworkBasicOps-server-336978854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-336978854',id=13,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKkaacrDvedxPMbuGbNB4D+7GmKG0jSP7UwyE9UdwzKvUml6pJP3WUTp8lVYy2EhEJc//Gz5RzdUUP/BIfjN/ctbdTSN2Mov9GIxsfSGWW7wkmHs1TpD4yrmDubARlS91Q==',key_name='tempest-TestNetworkBasicOps-1016592015',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-d1pnd1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:05:56Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=ef286a52-dd45-4442-8b3e-46d42c8631b0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.841 187003 DEBUG nova.network.os_vif_util [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.842 187003 DEBUG nova.network.os_vif_util [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.843 187003 DEBUG nova.objects.instance [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef286a52-dd45-4442-8b3e-46d42c8631b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.854 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] End _get_guest_xml xml=<domain type="kvm">
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <uuid>ef286a52-dd45-4442-8b3e-46d42c8631b0</uuid>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <name>instance-0000000d</name>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <memory>131072</memory>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <vcpu>1</vcpu>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <metadata>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:name>tempest-TestNetworkBasicOps-server-336978854</nova:name>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:creationTime>2025-11-24 02:05:58</nova:creationTime>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:flavor name="m1.nano">
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:memory>128</nova:memory>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:disk>1</nova:disk>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:swap>0</nova:swap>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:vcpus>1</nova:vcpus>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       </nova:flavor>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:owner>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:user uuid="e6a149dd53b548a9bac30b99c4c1141f">tempest-TestNetworkBasicOps-1968666008-project-member</nova:user>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:project uuid="b76cc415e6cf44a196b0c059e9c5a880">tempest-TestNetworkBasicOps-1968666008</nova:project>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       </nova:owner>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:root type="image" uuid="b6697012-8086-43d5-999a-6bb711240eaa"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <nova:ports>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         <nova:port uuid="03d8223f-1775-404e-821d-bb39489ef176">
Nov 24 02:05:58 compute-0 nova_compute[186999]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:         </nova:port>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       </nova:ports>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </nova:instance>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </metadata>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <sysinfo type="smbios">
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <system>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="manufacturer">RDO</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="product">OpenStack Compute</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="serial">ef286a52-dd45-4442-8b3e-46d42c8631b0</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="uuid">ef286a52-dd45-4442-8b3e-46d42c8631b0</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <entry name="family">Virtual Machine</entry>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </system>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </sysinfo>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <os>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <boot dev="hd"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <smbios mode="sysinfo"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </os>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <features>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <acpi/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <apic/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <vmcoreinfo/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </features>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <clock offset="utc">
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <timer name="hpet" present="no"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </clock>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <cpu mode="host-model" match="exact">
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </cpu>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   <devices>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <disk type="file" device="disk">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <target dev="vda" bus="virtio"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <disk type="file" device="cdrom">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <source file="/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.config"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <target dev="sda" bus="sata"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </disk>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <interface type="ethernet">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <mac address="fa:16:3e:6c:46:27"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <mtu size="1442"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <target dev="tap03d8223f-17"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </interface>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <serial type="pty">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <log file="/var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/console.log" append="off"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </serial>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <video>
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <model type="virtio"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </video>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <input type="tablet" bus="usb"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <rng model="virtio">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <backend model="random">/dev/urandom</backend>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </rng>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <controller type="usb" index="0"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     <memballoon model="virtio">
Nov 24 02:05:58 compute-0 nova_compute[186999]:       <stats period="10"/>
Nov 24 02:05:58 compute-0 nova_compute[186999]:     </memballoon>
Nov 24 02:05:58 compute-0 nova_compute[186999]:   </devices>
Nov 24 02:05:58 compute-0 nova_compute[186999]: </domain>
Nov 24 02:05:58 compute-0 nova_compute[186999]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.855 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Preparing to wait for external event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.856 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.857 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.857 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.859 187003 DEBUG nova.virt.libvirt.vif [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T02:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-336978854',display_name='tempest-TestNetworkBasicOps-server-336978854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-336978854',id=13,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKkaacrDvedxPMbuGbNB4D+7GmKG0jSP7UwyE9UdwzKvUml6pJP3WUTp8lVYy2EhEJc//Gz5RzdUUP/BIfjN/ctbdTSN2Mov9GIxsfSGWW7wkmHs1TpD4yrmDubARlS91Q==',key_name='tempest-TestNetworkBasicOps-1016592015',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-d1pnd1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T02:05:56Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=ef286a52-dd45-4442-8b3e-46d42c8631b0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.859 187003 DEBUG nova.network.os_vif_util [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.861 187003 DEBUG nova.network.os_vif_util [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.861 187003 DEBUG os_vif [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.863 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.864 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.865 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.869 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.870 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03d8223f-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.871 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03d8223f-17, col_values=(('external_ids', {'iface-id': '03d8223f-1775-404e-821d-bb39489ef176', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:46:27', 'vm-uuid': 'ef286a52-dd45-4442-8b3e-46d42c8631b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.873 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:58 compute-0 NetworkManager[55458]: <info>  [1763949958.8742] manager: (tap03d8223f-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.876 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.881 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.883 187003 INFO os_vif [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17')
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.932 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.933 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.933 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] No VIF found with MAC fa:16:3e:6c:46:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 02:05:58 compute-0 nova_compute[186999]: 2025-11-24 02:05:58.934 187003 INFO nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Using config drive
Nov 24 02:05:59 compute-0 nova_compute[186999]: 2025-11-24 02:05:59.770 187003 INFO nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Creating config drive at /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.config
Nov 24 02:05:59 compute-0 nova_compute[186999]: 2025-11-24 02:05:59.775 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsyomprbd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:05:59 compute-0 nova_compute[186999]: 2025-11-24 02:05:59.898 187003 DEBUG oslo_concurrency.processutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsyomprbd" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:05:59 compute-0 kernel: tap03d8223f-17: entered promiscuous mode
Nov 24 02:05:59 compute-0 ovn_controller[95380]: 2025-11-24T02:05:59Z|00154|binding|INFO|Claiming lport 03d8223f-1775-404e-821d-bb39489ef176 for this chassis.
Nov 24 02:05:59 compute-0 ovn_controller[95380]: 2025-11-24T02:05:59Z|00155|binding|INFO|03d8223f-1775-404e-821d-bb39489ef176: Claiming fa:16:3e:6c:46:27 10.100.0.7
Nov 24 02:05:59 compute-0 NetworkManager[55458]: <info>  [1763949959.9641] manager: (tap03d8223f-17): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 24 02:05:59 compute-0 nova_compute[186999]: 2025-11-24 02:05:59.963 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:59 compute-0 nova_compute[186999]: 2025-11-24 02:05:59.966 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:05:59 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:59.980 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:46:27 10.100.0.7'], port_security=['fa:16:3e:6c:46:27 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ef286a52-dd45-4442-8b3e-46d42c8631b0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31230271-4ba3-4910-8ba5-b0ec3e937889', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf1c156d-e9b6-4e89-9d2a-00a5938fcf1c, chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=03d8223f-1775-404e-821d-bb39489ef176) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:05:59 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:59.982 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 03d8223f-1775-404e-821d-bb39489ef176 in datapath c0b13e60-fb71-46a2-bea2-f40cc84e06ed bound to our chassis
Nov 24 02:05:59 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:59.983 104238 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c0b13e60-fb71-46a2-bea2-f40cc84e06ed
Nov 24 02:05:59 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:59.997 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[e8cf10c9-88a2-4ba9-ae17-01696bdee06b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:05:59 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:05:59.998 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc0b13e60-f1 in ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 02:05:59 compute-0 systemd-udevd[218832]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.002 213256 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc0b13e60-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.002 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd11a38-bb66-435e-9cf1-0a86d304a080]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.003 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[139f7af1-1847-463d-b850-84f9fa03e0bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 systemd-machined[153319]: New machine qemu-13-instance-0000000d.
Nov 24 02:06:00 compute-0 NetworkManager[55458]: <info>  [1763949960.0135] device (tap03d8223f-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 02:06:00 compute-0 NetworkManager[55458]: <info>  [1763949960.0144] device (tap03d8223f-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.021 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.023 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[505452ba-4c52-48ec-b75d-4d02d8194fe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_controller[95380]: 2025-11-24T02:06:00Z|00156|binding|INFO|Setting lport 03d8223f-1775-404e-821d-bb39489ef176 ovn-installed in OVS
Nov 24 02:06:00 compute-0 ovn_controller[95380]: 2025-11-24T02:06:00Z|00157|binding|INFO|Setting lport 03d8223f-1775-404e-821d-bb39489ef176 up in Southbound
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.029 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:00 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.050 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[52a664ea-f9b9-40b8-8cca-00b742160594]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.084 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[9261cf72-f295-4c63-b2d0-0e5419cc2805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.089 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[c8eeb21e-cf48-45a6-b5e0-6912c25821d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 NetworkManager[55458]: <info>  [1763949960.0902] manager: (tapc0b13e60-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 24 02:06:00 compute-0 systemd-udevd[218836]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.121 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[aef5d111-cca9-405d-be6d-15292a3bc577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.124 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[86f81cac-0bf2-4797-b03c-8c178b105a58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 NetworkManager[55458]: <info>  [1763949960.1494] device (tapc0b13e60-f0): carrier: link connected
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.155 213319 DEBUG oslo.privsep.daemon [-] privsep: reply[0c41b601-e910-424a-93a8-10fead263a01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.174 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc98d43-01dc-4a46-b259-05d543f399a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc0b13e60-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:03:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342681, 'reachable_time': 40015, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218865, 'error': None, 'target': 'ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.192 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca5a1d5-446d-4bde-a1eb-2a201bb6ba4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:346'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 342681, 'tstamp': 342681}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218866, 'error': None, 'target': 'ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.214 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[32843a0b-2414-476d-9898-bc9814d36f36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc0b13e60-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:03:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342681, 'reachable_time': 40015, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218867, 'error': None, 'target': 'ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.249 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[1118e434-5b16-41e5-b97a-2f7a983145e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.294 187003 DEBUG nova.compute.manager [req-866f0f4d-0c99-4fa6-b01f-ad19d125c843 req-d92291a2-2cab-460d-a0a9-936b58ca2943 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.295 187003 DEBUG oslo_concurrency.lockutils [req-866f0f4d-0c99-4fa6-b01f-ad19d125c843 req-d92291a2-2cab-460d-a0a9-936b58ca2943 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.295 187003 DEBUG oslo_concurrency.lockutils [req-866f0f4d-0c99-4fa6-b01f-ad19d125c843 req-d92291a2-2cab-460d-a0a9-936b58ca2943 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.295 187003 DEBUG oslo_concurrency.lockutils [req-866f0f4d-0c99-4fa6-b01f-ad19d125c843 req-d92291a2-2cab-460d-a0a9-936b58ca2943 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.296 187003 DEBUG nova.compute.manager [req-866f0f4d-0c99-4fa6-b01f-ad19d125c843 req-d92291a2-2cab-460d-a0a9-936b58ca2943 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Processing event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.309 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.310 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949960.3082657, ef286a52-dd45-4442-8b3e-46d42c8631b0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.310 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] VM Started (Lifecycle Event)
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.316 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0c4baf-1908-47a1-85ba-d9466ed5e945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.318 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0b13e60-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.318 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.318 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0b13e60-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:06:00 compute-0 NetworkManager[55458]: <info>  [1763949960.3213] manager: (tapc0b13e60-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 24 02:06:00 compute-0 kernel: tapc0b13e60-f0: entered promiscuous mode
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.320 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.322 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.325 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc0b13e60-f0, col_values=(('external_ids', {'iface-id': 'f5993a10-a523-425a-aca7-77bbf85aa188'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:06:00 compute-0 ovn_controller[95380]: 2025-11-24T02:06:00Z|00158|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.328 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.328 104238 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c0b13e60-fb71-46a2-bea2-f40cc84e06ed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c0b13e60-fb71-46a2-bea2-f40cc84e06ed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.328 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.329 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[7168ebe9-c7a5-46b6-9d02-50055c4178a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.330 104238 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: global
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     log         /dev/log local0 debug
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     log-tag     haproxy-metadata-proxy-c0b13e60-fb71-46a2-bea2-f40cc84e06ed
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     user        root
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     group       root
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     maxconn     1024
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     pidfile     /var/lib/neutron/external/pids/c0b13e60-fb71-46a2-bea2-f40cc84e06ed.pid.haproxy
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     daemon
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: defaults
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     log global
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     mode http
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     option httplog
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     option dontlognull
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     option http-server-close
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     option forwardfor
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     retries                 3
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     timeout http-request    30s
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     timeout connect         30s
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     timeout client          32s
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     timeout server          32s
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     timeout http-keep-alive 30s
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: listen listener
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     bind 169.254.169.254:80
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:     http-request add-header X-OVN-Network-ID c0b13e60-fb71-46a2-bea2-f40cc84e06ed
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 02:06:00 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:00.331 104238 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'env', 'PROCESS_TAG=haproxy-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c0b13e60-fb71-46a2-bea2-f40cc84e06ed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.336 187003 INFO nova.virt.libvirt.driver [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Instance spawned successfully.
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.336 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.339 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.340 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.360 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.360 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949960.3084068, ef286a52-dd45-4442-8b3e-46d42c8631b0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.360 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] VM Paused (Lifecycle Event)
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.366 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.367 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.367 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.368 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.368 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.369 187003 DEBUG nova.virt.libvirt.driver [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.377 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.381 187003 DEBUG nova.virt.driver [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] Emitting event <LifecycleEvent: 1763949960.317399, ef286a52-dd45-4442-8b3e-46d42c8631b0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.381 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] VM Resumed (Lifecycle Event)
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.405 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.409 187003 DEBUG nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.439 187003 INFO nova.compute.manager [None req-b64a34ce-de72-4cb3-bec0-89f088b5d99f - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.449 187003 INFO nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Took 4.15 seconds to spawn the instance on the hypervisor.
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.450 187003 DEBUG nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.514 187003 INFO nova.compute.manager [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Took 4.67 seconds to build instance.
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.529 187003 DEBUG oslo_concurrency.lockutils [None req-e9caaa65-1261-456e-b77b-3f1ed378ce56 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:00 compute-0 podman[218906]: 2025-11-24 02:06:00.745188506 +0000 UTC m=+0.055818095 container create 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:06:00 compute-0 systemd[1]: Started libpod-conmon-19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e.scope.
Nov 24 02:06:00 compute-0 podman[218906]: 2025-11-24 02:06:00.711164862 +0000 UTC m=+0.021794481 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 02:06:00 compute-0 systemd[1]: Started libcrun container.
Nov 24 02:06:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2acd9b88eccb7f4a1e2d7426a77f12d8dbb201f61d522100afbd024920620e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.823 187003 DEBUG nova.network.neutron [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updated VIF entry in instance network info cache for port 03d8223f-1775-404e-821d-bb39489ef176. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.824 187003 DEBUG nova.network.neutron [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updating instance_info_cache with network_info: [{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:06:00 compute-0 podman[218906]: 2025-11-24 02:06:00.825948158 +0000 UTC m=+0.136577757 container init 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 02:06:00 compute-0 podman[218906]: 2025-11-24 02:06:00.834277122 +0000 UTC m=+0.144906711 container start 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 24 02:06:00 compute-0 nova_compute[186999]: 2025-11-24 02:06:00.838 187003 DEBUG oslo_concurrency.lockutils [req-fe18666a-7f2e-44af-8c16-e394f34bf345 req-928b1da2-9aa8-408b-90a9-4dca8797b42d 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:06:00 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [NOTICE]   (218925) : New worker (218927) forked
Nov 24 02:06:00 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [NOTICE]   (218925) : Loading success.
Nov 24 02:06:01 compute-0 podman[218936]: 2025-11-24 02:06:01.813851425 +0000 UTC m=+0.064319213 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.367 187003 DEBUG nova.compute.manager [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.368 187003 DEBUG oslo_concurrency.lockutils [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.368 187003 DEBUG oslo_concurrency.lockutils [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.368 187003 DEBUG oslo_concurrency.lockutils [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.368 187003 DEBUG nova.compute.manager [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] No waiting events found dispatching network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.368 187003 WARNING nova.compute.manager [req-d4883119-fd74-469e-8e75-8f42e375f576 req-275232bc-2a70-4f5d-98b6-ed0f715402ee 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received unexpected event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 for instance with vm_state active and task_state None.
Nov 24 02:06:02 compute-0 nova_compute[186999]: 2025-11-24 02:06:02.662 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:03 compute-0 ovn_controller[95380]: 2025-11-24T02:06:03Z|00159|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:03 compute-0 NetworkManager[55458]: <info>  [1763949963.8086] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 24 02:06:03 compute-0 nova_compute[186999]: 2025-11-24 02:06:03.807 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:03 compute-0 NetworkManager[55458]: <info>  [1763949963.8096] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 24 02:06:03 compute-0 ovn_controller[95380]: 2025-11-24T02:06:03Z|00160|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:03 compute-0 nova_compute[186999]: 2025-11-24 02:06:03.848 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:03 compute-0 nova_compute[186999]: 2025-11-24 02:06:03.854 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:03 compute-0 nova_compute[186999]: 2025-11-24 02:06:03.872 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:04 compute-0 nova_compute[186999]: 2025-11-24 02:06:04.439 187003 DEBUG nova.compute.manager [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-changed-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:04 compute-0 nova_compute[186999]: 2025-11-24 02:06:04.440 187003 DEBUG nova.compute.manager [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing instance network info cache due to event network-changed-03d8223f-1775-404e-821d-bb39489ef176. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:06:04 compute-0 nova_compute[186999]: 2025-11-24 02:06:04.440 187003 DEBUG oslo_concurrency.lockutils [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:06:04 compute-0 nova_compute[186999]: 2025-11-24 02:06:04.440 187003 DEBUG oslo_concurrency.lockutils [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:06:04 compute-0 nova_compute[186999]: 2025-11-24 02:06:04.440 187003 DEBUG nova.network.neutron [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing network info cache for port 03d8223f-1775-404e-821d-bb39489ef176 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:06:05 compute-0 nova_compute[186999]: 2025-11-24 02:06:05.484 187003 DEBUG nova.network.neutron [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updated VIF entry in instance network info cache for port 03d8223f-1775-404e-821d-bb39489ef176. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:06:05 compute-0 nova_compute[186999]: 2025-11-24 02:06:05.485 187003 DEBUG nova.network.neutron [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updating instance_info_cache with network_info: [{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:06:05 compute-0 nova_compute[186999]: 2025-11-24 02:06:05.502 187003 DEBUG oslo_concurrency.lockutils [req-6e8c1967-be8a-44a3-afe7-a88e08dd772e req-573e87aa-65b9-410e-81a2-723178a378e7 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:06:06 compute-0 podman[218958]: 2025-11-24 02:06:06.821869973 +0000 UTC m=+0.070593998 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 02:06:07 compute-0 nova_compute[186999]: 2025-11-24 02:06:07.665 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:08 compute-0 nova_compute[186999]: 2025-11-24 02:06:08.875 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:10 compute-0 nova_compute[186999]: 2025-11-24 02:06:10.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:11 compute-0 ovn_controller[95380]: 2025-11-24T02:06:11Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:46:27 10.100.0.7
Nov 24 02:06:11 compute-0 ovn_controller[95380]: 2025-11-24T02:06:11Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:46:27 10.100.0.7
Nov 24 02:06:11 compute-0 podman[218988]: 2025-11-24 02:06:11.824892111 +0000 UTC m=+0.079988152 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:06:12 compute-0 nova_compute[186999]: 2025-11-24 02:06:12.667 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:12 compute-0 nova_compute[186999]: 2025-11-24 02:06:12.766 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:13 compute-0 nova_compute[186999]: 2025-11-24 02:06:13.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:13 compute-0 nova_compute[186999]: 2025-11-24 02:06:13.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:06:13 compute-0 nova_compute[186999]: 2025-11-24 02:06:13.794 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:06:13 compute-0 nova_compute[186999]: 2025-11-24 02:06:13.879 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:16 compute-0 podman[219012]: 2025-11-24 02:06:16.826848879 +0000 UTC m=+0.082794830 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 02:06:17 compute-0 nova_compute[186999]: 2025-11-24 02:06:17.670 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:17 compute-0 nova_compute[186999]: 2025-11-24 02:06:17.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:17 compute-0 nova_compute[186999]: 2025-11-24 02:06:17.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.468 187003 INFO nova.compute.manager [None req-2c466ae8-c081-48c3-a604-322f3ae37292 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Get console output
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.474 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.795 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.797 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.872 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.893 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.939 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:06:18 compute-0 nova_compute[186999]: 2025-11-24 02:06:18.940 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.000 187003 DEBUG oslo_concurrency.processutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.162 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.163 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5594MB free_disk=73.4267463684082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.164 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.164 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.262 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Instance ef286a52-dd45-4442-8b3e-46d42c8631b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.263 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.263 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.313 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.328 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.351 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:06:19 compute-0 nova_compute[186999]: 2025-11-24 02:06:19.351 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:20 compute-0 nova_compute[186999]: 2025-11-24 02:06:20.353 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:20 compute-0 podman[219040]: 2025-11-24 02:06:20.832016201 +0000 UTC m=+0.074337374 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 02:06:20 compute-0 podman[219041]: 2025-11-24 02:06:20.846822026 +0000 UTC m=+0.086311759 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:06:20 compute-0 podman[219042]: 2025-11-24 02:06:20.869958864 +0000 UTC m=+0.105567469 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 02:06:22 compute-0 nova_compute[186999]: 2025-11-24 02:06:22.673 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:22 compute-0 nova_compute[186999]: 2025-11-24 02:06:22.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:06:22 compute-0 nova_compute[186999]: 2025-11-24 02:06:22.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:06:22 compute-0 ovn_controller[95380]: 2025-11-24T02:06:22Z|00161|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:22 compute-0 nova_compute[186999]: 2025-11-24 02:06:22.840 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:22 compute-0 ovn_controller[95380]: 2025-11-24T02:06:22Z|00162|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:22 compute-0 nova_compute[186999]: 2025-11-24 02:06:22.891 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:23 compute-0 nova_compute[186999]: 2025-11-24 02:06:23.897 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:24 compute-0 nova_compute[186999]: 2025-11-24 02:06:24.921 187003 INFO nova.compute.manager [None req-412655e7-10c1-441e-8fd0-843bf8427032 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Get console output
Nov 24 02:06:24 compute-0 nova_compute[186999]: 2025-11-24 02:06:24.928 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:06:25 compute-0 nova_compute[186999]: 2025-11-24 02:06:25.417 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:25 compute-0 NetworkManager[55458]: <info>  [1763949985.4193] manager: (patch-provnet-e1173034-69f5-4892-8572-81d0734617e4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 24 02:06:25 compute-0 NetworkManager[55458]: <info>  [1763949985.4206] manager: (patch-br-int-to-provnet-e1173034-69f5-4892-8572-81d0734617e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 24 02:06:25 compute-0 ovn_controller[95380]: 2025-11-24T02:06:25Z|00163|binding|INFO|Releasing lport f5993a10-a523-425a-aca7-77bbf85aa188 from this chassis (sb_readonly=0)
Nov 24 02:06:25 compute-0 nova_compute[186999]: 2025-11-24 02:06:25.463 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:25 compute-0 nova_compute[186999]: 2025-11-24 02:06:25.469 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:25 compute-0 nova_compute[186999]: 2025-11-24 02:06:25.674 187003 INFO nova.compute.manager [None req-ca82f49b-5358-4527-84b0-a69e8c34baeb e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Get console output
Nov 24 02:06:25 compute-0 nova_compute[186999]: 2025-11-24 02:06:25.680 213157 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.450 187003 DEBUG nova.compute.manager [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-changed-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.451 187003 DEBUG nova.compute.manager [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing instance network info cache due to event network-changed-03d8223f-1775-404e-821d-bb39489ef176. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.451 187003 DEBUG oslo_concurrency.lockutils [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.452 187003 DEBUG oslo_concurrency.lockutils [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquired lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.452 187003 DEBUG nova.network.neutron [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Refreshing network info cache for port 03d8223f-1775-404e-821d-bb39489ef176 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.505 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.505 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.506 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.507 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.507 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.509 187003 INFO nova.compute.manager [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Terminating instance
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.511 187003 DEBUG nova.compute.manager [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 02:06:26 compute-0 kernel: tap03d8223f-17 (unregistering): left promiscuous mode
Nov 24 02:06:26 compute-0 NetworkManager[55458]: <info>  [1763949986.5455] device (tap03d8223f-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.563 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 ovn_controller[95380]: 2025-11-24T02:06:26Z|00164|binding|INFO|Releasing lport 03d8223f-1775-404e-821d-bb39489ef176 from this chassis (sb_readonly=0)
Nov 24 02:06:26 compute-0 ovn_controller[95380]: 2025-11-24T02:06:26Z|00165|binding|INFO|Setting lport 03d8223f-1775-404e-821d-bb39489ef176 down in Southbound
Nov 24 02:06:26 compute-0 ovn_controller[95380]: 2025-11-24T02:06:26Z|00166|binding|INFO|Removing iface tap03d8223f-17 ovn-installed in OVS
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.566 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.573 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:46:27 10.100.0.7'], port_security=['fa:16:3e:6c:46:27 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ef286a52-dd45-4442-8b3e-46d42c8631b0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b76cc415e6cf44a196b0c059e9c5a880', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31230271-4ba3-4910-8ba5-b0ec3e937889', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf1c156d-e9b6-4e89-9d2a-00a5938fcf1c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>], logical_port=03d8223f-1775-404e-821d-bb39489ef176) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb2d836a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.575 104238 INFO neutron.agent.ovn.metadata.agent [-] Port 03d8223f-1775-404e-821d-bb39489ef176 in datapath c0b13e60-fb71-46a2-bea2-f40cc84e06ed unbound from our chassis
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.577 104238 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0b13e60-fb71-46a2-bea2-f40cc84e06ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.578 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[95df7602-8a47-4171-a9ac-7584debd61fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.579 104238 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed namespace which is not needed anymore
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.604 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 24 02:06:26 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.365s CPU time.
Nov 24 02:06:26 compute-0 systemd-machined[153319]: Machine qemu-13-instance-0000000d terminated.
Nov 24 02:06:26 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [NOTICE]   (218925) : haproxy version is 2.8.14-c23fe91
Nov 24 02:06:26 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [NOTICE]   (218925) : path to executable is /usr/sbin/haproxy
Nov 24 02:06:26 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [WARNING]  (218925) : Exiting Master process...
Nov 24 02:06:26 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [ALERT]    (218925) : Current worker (218927) exited with code 143 (Terminated)
Nov 24 02:06:26 compute-0 neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed[218921]: [WARNING]  (218925) : All workers exited. Exiting... (0)
Nov 24 02:06:26 compute-0 systemd[1]: libpod-19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e.scope: Deactivated successfully.
Nov 24 02:06:26 compute-0 podman[219135]: 2025-11-24 02:06:26.739587921 +0000 UTC m=+0.047055369 container died 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 02:06:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2acd9b88eccb7f4a1e2d7426a77f12d8dbb201f61d522100afbd024920620e1-merged.mount: Deactivated successfully.
Nov 24 02:06:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e-userdata-shm.mount: Deactivated successfully.
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.783 187003 INFO nova.virt.libvirt.driver [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Instance destroyed successfully.
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.784 187003 DEBUG nova.objects.instance [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lazy-loading 'resources' on Instance uuid ef286a52-dd45-4442-8b3e-46d42c8631b0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 02:06:26 compute-0 podman[219135]: 2025-11-24 02:06:26.792612777 +0000 UTC m=+0.100080225 container cleanup 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.799 187003 DEBUG nova.virt.libvirt.vif [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T02:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-336978854',display_name='tempest-TestNetworkBasicOps-server-336978854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-336978854',id=13,image_ref='b6697012-8086-43d5-999a-6bb711240eaa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKkaacrDvedxPMbuGbNB4D+7GmKG0jSP7UwyE9UdwzKvUml6pJP3WUTp8lVYy2EhEJc//Gz5RzdUUP/BIfjN/ctbdTSN2Mov9GIxsfSGWW7wkmHs1TpD4yrmDubARlS91Q==',key_name='tempest-TestNetworkBasicOps-1016592015',keypairs=<?>,launch_index=0,launched_at=2025-11-24T02:06:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b76cc415e6cf44a196b0c059e9c5a880',ramdisk_id='',reservation_id='r-d1pnd1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b6697012-8086-43d5-999a-6bb711240eaa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1968666008',owner_user_name='tempest-TestNetworkBasicOps-1968666008-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T02:06:00Z,user_data=None,user_id='e6a149dd53b548a9bac30b99c4c1141f',uuid=ef286a52-dd45-4442-8b3e-46d42c8631b0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.800 187003 DEBUG nova.network.os_vif_util [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converting VIF {"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 02:06:26 compute-0 systemd[1]: libpod-conmon-19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e.scope: Deactivated successfully.
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.800 187003 DEBUG nova.network.os_vif_util [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.801 187003 DEBUG os_vif [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.802 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.802 187003 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03d8223f-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.804 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.806 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.809 187003 INFO os_vif [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:46:27,bridge_name='br-int',has_traffic_filtering=True,id=03d8223f-1775-404e-821d-bb39489ef176,network=Network(c0b13e60-fb71-46a2-bea2-f40cc84e06ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03d8223f-17')
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.809 187003 INFO nova.virt.libvirt.driver [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Deleting instance files /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0_del
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.810 187003 INFO nova.virt.libvirt.driver [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Deletion of /var/lib/nova/instances/ef286a52-dd45-4442-8b3e-46d42c8631b0_del complete
Nov 24 02:06:26 compute-0 podman[219183]: 2025-11-24 02:06:26.854069149 +0000 UTC m=+0.037971155 container remove 19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.859 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[45365d96-f4e1-4010-a5f8-a6fc4acc4f65]: (4, ('Mon Nov 24 02:06:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed (19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e)\n19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e\nMon Nov 24 02:06:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed (19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e)\n19ab31c64beafed8b2dfaeff9d6b47ffb76330ece42e4669d6cff1a541f7de4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.861 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[b666f4dc-ca09-4b86-b4c9-ee66d8d6a17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.862 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0b13e60-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.863 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 kernel: tapc0b13e60-f0: left promiscuous mode
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.878 187003 INFO nova.compute.manager [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.878 187003 DEBUG oslo.service.loopingcall [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.879 187003 DEBUG nova.compute.manager [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.879 187003 DEBUG nova.network.neutron [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 02:06:26 compute-0 nova_compute[186999]: 2025-11-24 02:06:26.905 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.908 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[517e702c-277c-43ca-ae89-a1e318014a63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.923 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[596081a5-7c97-423e-8514-9063dad94abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.924 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3f965a-7931-48e0-a765-14b8de4fb9f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.942 213256 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ff232f-659a-46e9-9e12-b482e43f1181]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342673, 'reachable_time': 36840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219198, 'error': None, 'target': 'ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.945 104347 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c0b13e60-fb71-46a2-bea2-f40cc84e06ed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 02:06:26 compute-0 systemd[1]: run-netns-ovnmeta\x2dc0b13e60\x2dfb71\x2d46a2\x2dbea2\x2df40cc84e06ed.mount: Deactivated successfully.
Nov 24 02:06:26 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:26.945 104347 DEBUG oslo.privsep.daemon [-] privsep: reply[22f73cbf-1787-4b3c-874a-fa9e84a55a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.675 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.943 187003 DEBUG nova.network.neutron [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updated VIF entry in instance network info cache for port 03d8223f-1775-404e-821d-bb39489ef176. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.944 187003 DEBUG nova.network.neutron [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updating instance_info_cache with network_info: [{"id": "03d8223f-1775-404e-821d-bb39489ef176", "address": "fa:16:3e:6c:46:27", "network": {"id": "c0b13e60-fb71-46a2-bea2-f40cc84e06ed", "bridge": "br-int", "label": "tempest-network-smoke--569619060", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b76cc415e6cf44a196b0c059e9c5a880", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03d8223f-17", "ovs_interfaceid": "03d8223f-1775-404e-821d-bb39489ef176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.959 187003 DEBUG oslo_concurrency.lockutils [req-fb65de3a-61b8-4b8a-8858-02c173101363 req-79997ad1-f170-4832-8586-ce5356c707af 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Releasing lock "refresh_cache-ef286a52-dd45-4442-8b3e-46d42c8631b0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.971 187003 DEBUG nova.network.neutron [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 02:06:27 compute-0 nova_compute[186999]: 2025-11-24 02:06:27.983 187003 INFO nova.compute.manager [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Took 1.10 seconds to deallocate network for instance.
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.028 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.029 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.084 187003 DEBUG nova.compute.provider_tree [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.097 187003 DEBUG nova.scheduler.client.report [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.118 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.145 187003 INFO nova.scheduler.client.report [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Deleted allocations for instance ef286a52-dd45-4442-8b3e-46d42c8631b0
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.214 187003 DEBUG oslo_concurrency.lockutils [None req-5ba42c9e-704c-4d5e-b18b-0b9cc0ec7545 e6a149dd53b548a9bac30b99c4c1141f b76cc415e6cf44a196b0c059e9c5a880 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.511 187003 DEBUG nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-vif-unplugged-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.512 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.512 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.512 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.513 187003 DEBUG nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] No waiting events found dispatching network-vif-unplugged-03d8223f-1775-404e-821d-bb39489ef176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.513 187003 WARNING nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received unexpected event network-vif-unplugged-03d8223f-1775-404e-821d-bb39489ef176 for instance with vm_state deleted and task_state None.
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.514 187003 DEBUG nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.514 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Acquiring lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.514 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.515 187003 DEBUG oslo_concurrency.lockutils [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] Lock "ef286a52-dd45-4442-8b3e-46d42c8631b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.515 187003 DEBUG nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] No waiting events found dispatching network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.515 187003 WARNING nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received unexpected event network-vif-plugged-03d8223f-1775-404e-821d-bb39489ef176 for instance with vm_state deleted and task_state None.
Nov 24 02:06:28 compute-0 nova_compute[186999]: 2025-11-24 02:06:28.516 187003 DEBUG nova.compute.manager [req-0247a860-3fd8-4d0c-b08d-1406e365df2f req-b2c42826-d78f-44fd-81ff-72db5fc0dbfb 18a15360d10d45a78f36931c5bebc7ae bb33fa4729f8496d9802c7483162af05 - - default default] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Received event network-vif-deleted-03d8223f-1775-404e-821d-bb39489ef176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 02:06:29 compute-0 sshd-session[219199]: Invalid user mcserver from 154.90.59.75 port 34550
Nov 24 02:06:30 compute-0 sshd-session[219199]: Received disconnect from 154.90.59.75 port 34550:11: Bye Bye [preauth]
Nov 24 02:06:30 compute-0 sshd-session[219199]: Disconnected from invalid user mcserver 154.90.59.75 port 34550 [preauth]
Nov 24 02:06:31 compute-0 sshd-session[219107]: Connection closed by authenticating user root 68.210.96.117 port 55620 [preauth]
Nov 24 02:06:31 compute-0 nova_compute[186999]: 2025-11-24 02:06:31.805 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:32 compute-0 nova_compute[186999]: 2025-11-24 02:06:32.716 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:32 compute-0 podman[219201]: 2025-11-24 02:06:32.819756657 +0000 UTC m=+0.066663468 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 24 02:06:35 compute-0 nova_compute[186999]: 2025-11-24 02:06:35.116 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:35 compute-0 nova_compute[186999]: 2025-11-24 02:06:35.209 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:36 compute-0 nova_compute[186999]: 2025-11-24 02:06:36.808 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:37 compute-0 nova_compute[186999]: 2025-11-24 02:06:37.717 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:37 compute-0 podman[219222]: 2025-11-24 02:06:37.819735199 +0000 UTC m=+0.063898121 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, io.openshift.expose-services=)
Nov 24 02:06:41 compute-0 nova_compute[186999]: 2025-11-24 02:06:41.780 187003 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763949986.7785165, ef286a52-dd45-4442-8b3e-46d42c8631b0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 02:06:41 compute-0 nova_compute[186999]: 2025-11-24 02:06:41.780 187003 INFO nova.compute.manager [-] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] VM Stopped (Lifecycle Event)
Nov 24 02:06:41 compute-0 nova_compute[186999]: 2025-11-24 02:06:41.792 187003 DEBUG nova.compute.manager [None req-e3d21e32-6d80-4fac-aec0-99ecff87a7aa - - - - - -] [instance: ef286a52-dd45-4442-8b3e-46d42c8631b0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 02:06:41 compute-0 nova_compute[186999]: 2025-11-24 02:06:41.810 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:42 compute-0 nova_compute[186999]: 2025-11-24 02:06:42.758 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:42 compute-0 podman[219243]: 2025-11-24 02:06:42.845720382 +0000 UTC m=+0.060146386 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:06:45 compute-0 sshd-session[219266]: Invalid user bodega from 46.188.119.26 port 38452
Nov 24 02:06:45 compute-0 sshd-session[219266]: Received disconnect from 46.188.119.26 port 38452:11: Bye Bye [preauth]
Nov 24 02:06:45 compute-0 sshd-session[219266]: Disconnected from invalid user bodega 46.188.119.26 port 38452 [preauth]
Nov 24 02:06:46 compute-0 nova_compute[186999]: 2025-11-24 02:06:46.813 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:47 compute-0 nova_compute[186999]: 2025-11-24 02:06:47.761 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:47 compute-0 podman[219268]: 2025-11-24 02:06:47.812843742 +0000 UTC m=+0.058493690 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 02:06:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:48.427 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:06:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:06:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:06:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:06:51 compute-0 podman[219288]: 2025-11-24 02:06:51.812000235 +0000 UTC m=+0.047885093 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:06:51 compute-0 podman[219287]: 2025-11-24 02:06:51.816576823 +0000 UTC m=+0.056177355 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 02:06:51 compute-0 nova_compute[186999]: 2025-11-24 02:06:51.816 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:51 compute-0 podman[219289]: 2025-11-24 02:06:51.853753224 +0000 UTC m=+0.084436586 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 02:06:52 compute-0 nova_compute[186999]: 2025-11-24 02:06:52.762 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:56 compute-0 nova_compute[186999]: 2025-11-24 02:06:56.823 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:06:57 compute-0 nova_compute[186999]: 2025-11-24 02:06:57.764 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:01 compute-0 nova_compute[186999]: 2025-11-24 02:07:01.826 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:02 compute-0 nova_compute[186999]: 2025-11-24 02:07:02.766 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:03 compute-0 podman[219356]: 2025-11-24 02:07:03.807710834 +0000 UTC m=+0.059598911 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm)
Nov 24 02:07:04 compute-0 nova_compute[186999]: 2025-11-24 02:07:04.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:04 compute-0 nova_compute[186999]: 2025-11-24 02:07:04.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 02:07:06 compute-0 ovn_controller[95380]: 2025-11-24T02:07:06Z|00167|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Nov 24 02:07:06 compute-0 nova_compute[186999]: 2025-11-24 02:07:06.849 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:07 compute-0 nova_compute[186999]: 2025-11-24 02:07:07.769 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:08 compute-0 podman[219376]: 2025-11-24 02:07:08.825029673 +0000 UTC m=+0.071396341 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:07:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:07:11 compute-0 nova_compute[186999]: 2025-11-24 02:07:11.789 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:11 compute-0 nova_compute[186999]: 2025-11-24 02:07:11.852 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:12 compute-0 nova_compute[186999]: 2025-11-24 02:07:12.820 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:13 compute-0 nova_compute[186999]: 2025-11-24 02:07:13.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:13 compute-0 nova_compute[186999]: 2025-11-24 02:07:13.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:07:13 compute-0 nova_compute[186999]: 2025-11-24 02:07:13.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:07:13 compute-0 nova_compute[186999]: 2025-11-24 02:07:13.801 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:07:13 compute-0 podman[219395]: 2025-11-24 02:07:13.828907864 +0000 UTC m=+0.080594629 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:07:16 compute-0 nova_compute[186999]: 2025-11-24 02:07:16.854 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:17 compute-0 nova_compute[186999]: 2025-11-24 02:07:17.820 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.804 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.805 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:07:18 compute-0 podman[219419]: 2025-11-24 02:07:18.840399812 +0000 UTC m=+0.080219948 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.968 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.969 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5762MB free_disk=73.45561599731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.970 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:07:18 compute-0 nova_compute[186999]: 2025-11-24 02:07:18.970 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.244 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.244 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.325 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing inventories for resource provider f28f14d1-2972-450a-b67e-0899e7918234 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.431 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating ProviderTree inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.432 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.456 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing aggregate associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.478 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing trait associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, traits: COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AESNI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,HW_CPU_X86_ABM,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.501 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.514 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.541 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:07:19 compute-0 nova_compute[186999]: 2025-11-24 02:07:19.542 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:07:20 compute-0 nova_compute[186999]: 2025-11-24 02:07:20.537 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:20 compute-0 nova_compute[186999]: 2025-11-24 02:07:20.538 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:20 compute-0 nova_compute[186999]: 2025-11-24 02:07:20.538 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:20 compute-0 nova_compute[186999]: 2025-11-24 02:07:20.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:21 compute-0 nova_compute[186999]: 2025-11-24 02:07:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:21 compute-0 nova_compute[186999]: 2025-11-24 02:07:21.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 02:07:21 compute-0 nova_compute[186999]: 2025-11-24 02:07:21.786 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 02:07:21 compute-0 nova_compute[186999]: 2025-11-24 02:07:21.857 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:22 compute-0 nova_compute[186999]: 2025-11-24 02:07:22.787 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:22 compute-0 nova_compute[186999]: 2025-11-24 02:07:22.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:07:22 compute-0 nova_compute[186999]: 2025-11-24 02:07:22.823 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:22 compute-0 podman[219440]: 2025-11-24 02:07:22.834763088 +0000 UTC m=+0.069835707 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:07:22 compute-0 podman[219441]: 2025-11-24 02:07:22.857564027 +0000 UTC m=+0.098506361 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 24 02:07:22 compute-0 podman[219439]: 2025-11-24 02:07:22.864841121 +0000 UTC m=+0.104903970 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 02:07:23 compute-0 nova_compute[186999]: 2025-11-24 02:07:23.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:07:24 compute-0 sshd-session[219505]: Accepted publickey for zuul from 192.168.122.10 port 57782 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 02:07:24 compute-0 systemd-logind[791]: New session 26 of user zuul.
Nov 24 02:07:24 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 24 02:07:24 compute-0 sshd-session[219505]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 02:07:24 compute-0 sudo[219509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 02:07:24 compute-0 sudo[219509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 02:07:26 compute-0 nova_compute[186999]: 2025-11-24 02:07:26.860 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:27 compute-0 nova_compute[186999]: 2025-11-24 02:07:27.824 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:29 compute-0 ovs-vsctl[219679]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 02:07:29 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 219533 (sos)
Nov 24 02:07:29 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 24 02:07:29 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 24 02:07:30 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 02:07:30 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 02:07:30 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 02:07:31 compute-0 crontab[220077]: (root) LIST (root)
Nov 24 02:07:31 compute-0 nova_compute[186999]: 2025-11-24 02:07:31.862 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:32 compute-0 nova_compute[186999]: 2025-11-24 02:07:32.858 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:33 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 02:07:33 compute-0 systemd[1]: Started Hostname Service.
Nov 24 02:07:34 compute-0 podman[220198]: 2025-11-24 02:07:34.016219295 +0000 UTC m=+0.096668545 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 02:07:36 compute-0 nova_compute[186999]: 2025-11-24 02:07:36.865 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:37 compute-0 nova_compute[186999]: 2025-11-24 02:07:37.860 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:39 compute-0 podman[220886]: 2025-11-24 02:07:39.617267152 +0000 UTC m=+0.070338496 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 24 02:07:40 compute-0 ovs-appctl[221305]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 02:07:40 compute-0 ovs-appctl[221309]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 02:07:40 compute-0 ovs-appctl[221315]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 02:07:41 compute-0 nova_compute[186999]: 2025-11-24 02:07:41.868 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:42 compute-0 nova_compute[186999]: 2025-11-24 02:07:42.863 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:43 compute-0 podman[222266]: 2025-11-24 02:07:43.991276683 +0000 UTC m=+0.083516517 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:07:46 compute-0 nova_compute[186999]: 2025-11-24 02:07:46.874 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:47 compute-0 nova_compute[186999]: 2025-11-24 02:07:47.865 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:48 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 02:07:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:07:48.428 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:07:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:07:48.429 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:07:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:07:48.429 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:07:48 compute-0 podman[222736]: 2025-11-24 02:07:48.970829662 +0000 UTC m=+0.078185967 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 24 02:07:49 compute-0 systemd[1]: Starting Time & Date Service...
Nov 24 02:07:49 compute-0 systemd[1]: Started Time & Date Service.
Nov 24 02:07:49 compute-0 sshd-session[222705]: Invalid user marco from 154.90.59.75 port 59422
Nov 24 02:07:50 compute-0 sshd-session[222705]: Received disconnect from 154.90.59.75 port 59422:11: Bye Bye [preauth]
Nov 24 02:07:50 compute-0 sshd-session[222705]: Disconnected from invalid user marco 154.90.59.75 port 59422 [preauth]
Nov 24 02:07:51 compute-0 nova_compute[186999]: 2025-11-24 02:07:51.877 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:52 compute-0 nova_compute[186999]: 2025-11-24 02:07:52.930 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:53 compute-0 podman[222867]: 2025-11-24 02:07:53.816987574 +0000 UTC m=+0.061064396 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:07:53 compute-0 podman[222866]: 2025-11-24 02:07:53.817057146 +0000 UTC m=+0.061247041 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 02:07:53 compute-0 podman[222868]: 2025-11-24 02:07:53.847922373 +0000 UTC m=+0.085355978 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 02:07:56 compute-0 nova_compute[186999]: 2025-11-24 02:07:56.883 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:07:57 compute-0 sshd-session[222931]: Received disconnect from 46.188.119.26 port 38778:11: Bye Bye [preauth]
Nov 24 02:07:57 compute-0 sshd-session[222931]: Disconnected from authenticating user root 46.188.119.26 port 38778 [preauth]
Nov 24 02:07:57 compute-0 nova_compute[186999]: 2025-11-24 02:07:57.931 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:01 compute-0 nova_compute[186999]: 2025-11-24 02:08:01.887 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:02 compute-0 nova_compute[186999]: 2025-11-24 02:08:02.934 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:04 compute-0 podman[222933]: 2025-11-24 02:08:04.17685373 +0000 UTC m=+0.070784399 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:08:06 compute-0 nova_compute[186999]: 2025-11-24 02:08:06.889 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:07 compute-0 nova_compute[186999]: 2025-11-24 02:08:07.941 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:09 compute-0 podman[222953]: 2025-11-24 02:08:09.842820089 +0000 UTC m=+0.080261554 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Nov 24 02:08:11 compute-0 sudo[219509]: pam_unix(sudo:session): session closed for user root
Nov 24 02:08:11 compute-0 sshd-session[219508]: Received disconnect from 192.168.122.10 port 57782:11: disconnected by user
Nov 24 02:08:11 compute-0 sshd-session[219508]: Disconnected from user zuul 192.168.122.10 port 57782
Nov 24 02:08:11 compute-0 sshd-session[219505]: pam_unix(sshd:session): session closed for user zuul
Nov 24 02:08:11 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 24 02:08:11 compute-0 systemd[1]: session-26.scope: Consumed 1min 16.651s CPU time, 498.1M memory peak, read 101.2M from disk, written 38.5M to disk.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Session 26 logged out. Waiting for processes to exit.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Removed session 26.
Nov 24 02:08:11 compute-0 sshd-session[222974]: Accepted publickey for zuul from 192.168.122.10 port 54038 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 02:08:11 compute-0 systemd-logind[791]: New session 27 of user zuul.
Nov 24 02:08:11 compute-0 systemd[1]: Started Session 27 of User zuul.
Nov 24 02:08:11 compute-0 sshd-session[222974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 02:08:11 compute-0 sudo[222978]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-11-24-gssypjp.tar.xz
Nov 24 02:08:11 compute-0 sudo[222978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 02:08:11 compute-0 sudo[222978]: pam_unix(sudo:session): session closed for user root
Nov 24 02:08:11 compute-0 sshd-session[222977]: Received disconnect from 192.168.122.10 port 54038:11: disconnected by user
Nov 24 02:08:11 compute-0 sshd-session[222977]: Disconnected from user zuul 192.168.122.10 port 54038
Nov 24 02:08:11 compute-0 sshd-session[222974]: pam_unix(sshd:session): session closed for user zuul
Nov 24 02:08:11 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Session 27 logged out. Waiting for processes to exit.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Removed session 27.
Nov 24 02:08:11 compute-0 sshd-session[223003]: Accepted publickey for zuul from 192.168.122.10 port 54046 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 02:08:11 compute-0 systemd-logind[791]: New session 28 of user zuul.
Nov 24 02:08:11 compute-0 systemd[1]: Started Session 28 of User zuul.
Nov 24 02:08:11 compute-0 sshd-session[223003]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 02:08:11 compute-0 nova_compute[186999]: 2025-11-24 02:08:11.802 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:11 compute-0 sudo[223007]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 24 02:08:11 compute-0 sudo[223007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 02:08:11 compute-0 sudo[223007]: pam_unix(sudo:session): session closed for user root
Nov 24 02:08:11 compute-0 sshd-session[223006]: Received disconnect from 192.168.122.10 port 54046:11: disconnected by user
Nov 24 02:08:11 compute-0 sshd-session[223006]: Disconnected from user zuul 192.168.122.10 port 54046
Nov 24 02:08:11 compute-0 sshd-session[223003]: pam_unix(sshd:session): session closed for user zuul
Nov 24 02:08:11 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Session 28 logged out. Waiting for processes to exit.
Nov 24 02:08:11 compute-0 systemd-logind[791]: Removed session 28.
Nov 24 02:08:11 compute-0 nova_compute[186999]: 2025-11-24 02:08:11.894 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:12 compute-0 nova_compute[186999]: 2025-11-24 02:08:12.948 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:14 compute-0 podman[223032]: 2025-11-24 02:08:14.806267287 +0000 UTC m=+0.057062514 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:08:15 compute-0 nova_compute[186999]: 2025-11-24 02:08:15.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:15 compute-0 nova_compute[186999]: 2025-11-24 02:08:15.774 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:08:15 compute-0 nova_compute[186999]: 2025-11-24 02:08:15.774 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:08:15 compute-0 nova_compute[186999]: 2025-11-24 02:08:15.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:08:16 compute-0 nova_compute[186999]: 2025-11-24 02:08:16.780 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:16 compute-0 nova_compute[186999]: 2025-11-24 02:08:16.897 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:17 compute-0 nova_compute[186999]: 2025-11-24 02:08:17.950 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:18 compute-0 nova_compute[186999]: 2025-11-24 02:08:18.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:19 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 02:08:19 compute-0 podman[223058]: 2025-11-24 02:08:19.800790178 +0000 UTC m=+0.052170457 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.806 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:08:19 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.981 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.983 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5588MB free_disk=73.45514297485352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.983 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:08:19 compute-0 nova_compute[186999]: 2025-11-24 02:08:19.984 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.039 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.040 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.067 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.078 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.080 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:08:20 compute-0 nova_compute[186999]: 2025-11-24 02:08:20.080 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:08:21 compute-0 nova_compute[186999]: 2025-11-24 02:08:21.075 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:21 compute-0 nova_compute[186999]: 2025-11-24 02:08:21.076 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:21 compute-0 nova_compute[186999]: 2025-11-24 02:08:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:21 compute-0 nova_compute[186999]: 2025-11-24 02:08:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:21 compute-0 nova_compute[186999]: 2025-11-24 02:08:21.900 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:22 compute-0 nova_compute[186999]: 2025-11-24 02:08:22.953 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:24 compute-0 nova_compute[186999]: 2025-11-24 02:08:24.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:08:24 compute-0 nova_compute[186999]: 2025-11-24 02:08:24.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:08:24 compute-0 podman[223083]: 2025-11-24 02:08:24.803826706 +0000 UTC m=+0.056088777 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:08:24 compute-0 podman[223082]: 2025-11-24 02:08:24.804359141 +0000 UTC m=+0.060642815 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:08:24 compute-0 podman[223084]: 2025-11-24 02:08:24.86840485 +0000 UTC m=+0.116848083 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:08:26 compute-0 nova_compute[186999]: 2025-11-24 02:08:26.903 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:27 compute-0 nova_compute[186999]: 2025-11-24 02:08:27.955 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:31 compute-0 nova_compute[186999]: 2025-11-24 02:08:31.906 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:32 compute-0 nova_compute[186999]: 2025-11-24 02:08:32.956 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:34 compute-0 podman[223151]: 2025-11-24 02:08:34.836164332 +0000 UTC m=+0.085907733 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:08:36 compute-0 nova_compute[186999]: 2025-11-24 02:08:36.909 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:37 compute-0 nova_compute[186999]: 2025-11-24 02:08:37.958 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:40 compute-0 podman[223172]: 2025-11-24 02:08:40.800410941 +0000 UTC m=+0.056453636 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, vendor=Red Hat, Inc.)
Nov 24 02:08:41 compute-0 nova_compute[186999]: 2025-11-24 02:08:41.912 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:42 compute-0 nova_compute[186999]: 2025-11-24 02:08:42.960 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:45 compute-0 podman[223193]: 2025-11-24 02:08:45.798279945 +0000 UTC m=+0.055430028 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 02:08:46 compute-0 nova_compute[186999]: 2025-11-24 02:08:46.915 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:47 compute-0 nova_compute[186999]: 2025-11-24 02:08:47.961 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:08:48.429 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:08:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:08:48.430 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:08:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:08:48.430 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:08:50 compute-0 podman[223217]: 2025-11-24 02:08:50.792397513 +0000 UTC m=+0.049117080 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 24 02:08:51 compute-0 nova_compute[186999]: 2025-11-24 02:08:51.919 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:52 compute-0 nova_compute[186999]: 2025-11-24 02:08:52.993 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:55 compute-0 podman[223238]: 2025-11-24 02:08:55.806187556 +0000 UTC m=+0.059707839 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 02:08:55 compute-0 podman[223237]: 2025-11-24 02:08:55.825043675 +0000 UTC m=+0.081393507 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 02:08:55 compute-0 podman[223239]: 2025-11-24 02:08:55.850208762 +0000 UTC m=+0.100281788 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 24 02:08:56 compute-0 nova_compute[186999]: 2025-11-24 02:08:56.922 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:08:57 compute-0 nova_compute[186999]: 2025-11-24 02:08:57.996 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:01 compute-0 nova_compute[186999]: 2025-11-24 02:09:01.926 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:02 compute-0 nova_compute[186999]: 2025-11-24 02:09:02.998 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:03 compute-0 sshd-session[223304]: Received disconnect from 193.46.255.7 port 55264:11:  [preauth]
Nov 24 02:09:03 compute-0 sshd-session[223304]: Disconnected from authenticating user root 193.46.255.7 port 55264 [preauth]
Nov 24 02:09:05 compute-0 podman[223306]: 2025-11-24 02:09:05.826292217 +0000 UTC m=+0.074917955 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 02:09:06 compute-0 nova_compute[186999]: 2025-11-24 02:09:06.935 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:08 compute-0 nova_compute[186999]: 2025-11-24 02:09:08.000 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:08 compute-0 sshd-session[223326]: Invalid user gits from 154.90.59.75 port 34598
Nov 24 02:09:08 compute-0 sshd-session[223326]: Received disconnect from 154.90.59.75 port 34598:11: Bye Bye [preauth]
Nov 24 02:09:08 compute-0 sshd-session[223326]: Disconnected from invalid user gits 154.90.59.75 port 34598 [preauth]
Nov 24 02:09:10 compute-0 sshd-session[223328]: Invalid user devops from 46.188.119.26 port 39106
Nov 24 02:09:10 compute-0 sshd-session[223328]: Received disconnect from 46.188.119.26 port 39106:11: Bye Bye [preauth]
Nov 24 02:09:10 compute-0 sshd-session[223328]: Disconnected from invalid user devops 46.188.119.26 port 39106 [preauth]
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:09:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:09:11 compute-0 podman[223330]: 2025-11-24 02:09:11.820649631 +0000 UTC m=+0.072897269 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 24 02:09:11 compute-0 nova_compute[186999]: 2025-11-24 02:09:11.938 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:12 compute-0 nova_compute[186999]: 2025-11-24 02:09:12.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:13 compute-0 nova_compute[186999]: 2025-11-24 02:09:13.003 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:16 compute-0 podman[223349]: 2025-11-24 02:09:16.792586345 +0000 UTC m=+0.049359707 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:09:16 compute-0 nova_compute[186999]: 2025-11-24 02:09:16.988 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:17 compute-0 nova_compute[186999]: 2025-11-24 02:09:17.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:17 compute-0 nova_compute[186999]: 2025-11-24 02:09:17.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:09:17 compute-0 nova_compute[186999]: 2025-11-24 02:09:17.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:09:17 compute-0 nova_compute[186999]: 2025-11-24 02:09:17.792 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:09:18 compute-0 nova_compute[186999]: 2025-11-24 02:09:18.038 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:19 compute-0 nova_compute[186999]: 2025-11-24 02:09:19.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:19 compute-0 nova_compute[186999]: 2025-11-24 02:09:19.997 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:09:19 compute-0 nova_compute[186999]: 2025-11-24 02:09:19.997 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:09:19 compute-0 nova_compute[186999]: 2025-11-24 02:09:19.997 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:09:19 compute-0 nova_compute[186999]: 2025-11-24 02:09:19.997 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.141 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.142 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.45529174804688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.143 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.143 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.199 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.199 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.233 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.243 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.244 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:09:20 compute-0 nova_compute[186999]: 2025-11-24 02:09:20.245 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:09:21 compute-0 nova_compute[186999]: 2025-11-24 02:09:21.244 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:21 compute-0 nova_compute[186999]: 2025-11-24 02:09:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:21 compute-0 nova_compute[186999]: 2025-11-24 02:09:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:21 compute-0 nova_compute[186999]: 2025-11-24 02:09:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:21 compute-0 podman[223373]: 2025-11-24 02:09:21.790739685 +0000 UTC m=+0.045470328 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 02:09:21 compute-0 nova_compute[186999]: 2025-11-24 02:09:21.991 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:22 compute-0 nova_compute[186999]: 2025-11-24 02:09:22.766 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:23 compute-0 nova_compute[186999]: 2025-11-24 02:09:23.065 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:26 compute-0 nova_compute[186999]: 2025-11-24 02:09:26.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:09:26 compute-0 nova_compute[186999]: 2025-11-24 02:09:26.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:09:26 compute-0 podman[223392]: 2025-11-24 02:09:26.847830323 +0000 UTC m=+0.090650008 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 02:09:26 compute-0 podman[223393]: 2025-11-24 02:09:26.856850216 +0000 UTC m=+0.096619425 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:09:26 compute-0 podman[223394]: 2025-11-24 02:09:26.88618619 +0000 UTC m=+0.113576411 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 02:09:26 compute-0 nova_compute[186999]: 2025-11-24 02:09:26.993 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:28 compute-0 nova_compute[186999]: 2025-11-24 02:09:28.067 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:32 compute-0 nova_compute[186999]: 2025-11-24 02:09:32.029 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:33 compute-0 nova_compute[186999]: 2025-11-24 02:09:33.114 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:36 compute-0 podman[223458]: 2025-11-24 02:09:36.814175038 +0000 UTC m=+0.062847395 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 02:09:37 compute-0 nova_compute[186999]: 2025-11-24 02:09:37.033 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:38 compute-0 nova_compute[186999]: 2025-11-24 02:09:38.161 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:42 compute-0 nova_compute[186999]: 2025-11-24 02:09:42.035 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:42 compute-0 podman[223478]: 2025-11-24 02:09:42.799855768 +0000 UTC m=+0.054111498 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 24 02:09:43 compute-0 nova_compute[186999]: 2025-11-24 02:09:43.164 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:47 compute-0 nova_compute[186999]: 2025-11-24 02:09:47.037 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:47 compute-0 podman[223500]: 2025-11-24 02:09:47.810136646 +0000 UTC m=+0.060688213 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:09:48 compute-0 nova_compute[186999]: 2025-11-24 02:09:48.203 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:09:48.430 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:09:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:09:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:09:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:09:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:09:52 compute-0 nova_compute[186999]: 2025-11-24 02:09:52.040 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:52 compute-0 podman[223524]: 2025-11-24 02:09:52.793950487 +0000 UTC m=+0.050719452 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 02:09:53 compute-0 nova_compute[186999]: 2025-11-24 02:09:53.204 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:57 compute-0 nova_compute[186999]: 2025-11-24 02:09:57.042 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:09:57 compute-0 podman[223543]: 2025-11-24 02:09:57.803516294 +0000 UTC m=+0.050557317 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 24 02:09:57 compute-0 podman[223544]: 2025-11-24 02:09:57.807505267 +0000 UTC m=+0.045988309 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 02:09:57 compute-0 podman[223545]: 2025-11-24 02:09:57.836088733 +0000 UTC m=+0.072088305 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:09:58 compute-0 nova_compute[186999]: 2025-11-24 02:09:58.206 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:02 compute-0 nova_compute[186999]: 2025-11-24 02:10:02.044 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:03 compute-0 nova_compute[186999]: 2025-11-24 02:10:03.208 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:07 compute-0 nova_compute[186999]: 2025-11-24 02:10:07.046 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:07 compute-0 podman[223612]: 2025-11-24 02:10:07.819004778 +0000 UTC m=+0.064307356 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm)
Nov 24 02:10:08 compute-0 nova_compute[186999]: 2025-11-24 02:10:08.214 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:12 compute-0 nova_compute[186999]: 2025-11-24 02:10:12.049 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:13 compute-0 nova_compute[186999]: 2025-11-24 02:10:13.217 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:13 compute-0 podman[223634]: 2025-11-24 02:10:13.801937519 +0000 UTC m=+0.058673196 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 02:10:14 compute-0 nova_compute[186999]: 2025-11-24 02:10:14.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:15 compute-0 sshd-session[223655]: Invalid user test from 80.94.95.115 port 48866
Nov 24 02:10:16 compute-0 sshd-session[223655]: Connection closed by invalid user test 80.94.95.115 port 48866 [preauth]
Nov 24 02:10:17 compute-0 nova_compute[186999]: 2025-11-24 02:10:17.052 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.219 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:18 compute-0 sshd-session[223632]: Connection closed by authenticating user root 68.210.96.117 port 35740 [preauth]
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.780 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.780 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.780 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:10:18 compute-0 nova_compute[186999]: 2025-11-24 02:10:18.788 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:10:18 compute-0 podman[223657]: 2025-11-24 02:10:18.798830289 +0000 UTC m=+0.049382495 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:10:20 compute-0 nova_compute[186999]: 2025-11-24 02:10:20.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:21 compute-0 nova_compute[186999]: 2025-11-24 02:10:21.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:21 compute-0 nova_compute[186999]: 2025-11-24 02:10:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.055 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.135 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.135 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.135 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.136 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.256 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.257 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.45538330078125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.257 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.257 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.302 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.303 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.320 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.332 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.334 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:10:22 compute-0 nova_compute[186999]: 2025-11-24 02:10:22.334 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:10:23 compute-0 nova_compute[186999]: 2025-11-24 02:10:23.221 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:23 compute-0 nova_compute[186999]: 2025-11-24 02:10:23.330 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:23 compute-0 nova_compute[186999]: 2025-11-24 02:10:23.331 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:23 compute-0 nova_compute[186999]: 2025-11-24 02:10:23.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:23 compute-0 podman[223681]: 2025-11-24 02:10:23.795787241 +0000 UTC m=+0.047938214 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 02:10:27 compute-0 nova_compute[186999]: 2025-11-24 02:10:27.058 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:27 compute-0 sshd-session[223701]: Received disconnect from 154.90.59.75 port 40116:11: Bye Bye [preauth]
Nov 24 02:10:27 compute-0 sshd-session[223701]: Disconnected from authenticating user root 154.90.59.75 port 40116 [preauth]
Nov 24 02:10:27 compute-0 nova_compute[186999]: 2025-11-24 02:10:27.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:10:27 compute-0 nova_compute[186999]: 2025-11-24 02:10:27.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:10:28 compute-0 nova_compute[186999]: 2025-11-24 02:10:28.224 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:28 compute-0 podman[223704]: 2025-11-24 02:10:28.807939382 +0000 UTC m=+0.053395528 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:10:28 compute-0 podman[223703]: 2025-11-24 02:10:28.814806625 +0000 UTC m=+0.065039356 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 24 02:10:28 compute-0 podman[223705]: 2025-11-24 02:10:28.850727569 +0000 UTC m=+0.091064961 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 02:10:32 compute-0 nova_compute[186999]: 2025-11-24 02:10:32.062 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:33 compute-0 nova_compute[186999]: 2025-11-24 02:10:33.273 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:37 compute-0 nova_compute[186999]: 2025-11-24 02:10:37.065 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:38 compute-0 nova_compute[186999]: 2025-11-24 02:10:38.273 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:38 compute-0 podman[223773]: 2025-11-24 02:10:38.80519153 +0000 UTC m=+0.058017798 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:10:42 compute-0 nova_compute[186999]: 2025-11-24 02:10:42.066 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:43 compute-0 nova_compute[186999]: 2025-11-24 02:10:43.326 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:44 compute-0 podman[223794]: 2025-11-24 02:10:44.807484828 +0000 UTC m=+0.062706230 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6)
Nov 24 02:10:47 compute-0 nova_compute[186999]: 2025-11-24 02:10:47.069 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:48 compute-0 nova_compute[186999]: 2025-11-24 02:10:48.327 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:48.430 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:10:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:10:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:10:49 compute-0 podman[223816]: 2025-11-24 02:10:49.795604691 +0000 UTC m=+0.049009394 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:10:51 compute-0 nova_compute[186999]: 2025-11-24 02:10:51.117 187003 DEBUG oslo_concurrency.processutils [None req-b2457670-321c-46a6-806d-3356b98e7c74 4b13688172504c329b95a3f103b5ec01 f637cef21e80464abf1687ea895028cc - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 02:10:51 compute-0 nova_compute[186999]: 2025-11-24 02:10:51.149 187003 DEBUG oslo_concurrency.processutils [None req-b2457670-321c-46a6-806d-3356b98e7c74 4b13688172504c329b95a3f103b5ec01 f637cef21e80464abf1687ea895028cc - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 02:10:52 compute-0 nova_compute[186999]: 2025-11-24 02:10:52.072 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:53 compute-0 nova_compute[186999]: 2025-11-24 02:10:53.329 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:54 compute-0 podman[223842]: 2025-11-24 02:10:54.795773124 +0000 UTC m=+0.049643203 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 02:10:57 compute-0 nova_compute[186999]: 2025-11-24 02:10:57.075 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:57.991 104238 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:91:11', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3a:b5:c9:fe:8c:90'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 02:10:57 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:57.992 104238 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 02:10:57 compute-0 nova_compute[186999]: 2025-11-24 02:10:57.992 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:58 compute-0 nova_compute[186999]: 2025-11-24 02:10:58.331 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:10:58 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:10:58.994 104238 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8ad7b7b-7799-4041-b082-e8facd56e34a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 02:10:59 compute-0 podman[223862]: 2025-11-24 02:10:59.807374739 +0000 UTC m=+0.058842561 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 02:10:59 compute-0 podman[223863]: 2025-11-24 02:10:59.83680435 +0000 UTC m=+0.082361135 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:10:59 compute-0 podman[223864]: 2025-11-24 02:10:59.83682688 +0000 UTC m=+0.081558502 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 02:11:02 compute-0 nova_compute[186999]: 2025-11-24 02:11:02.078 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:03 compute-0 nova_compute[186999]: 2025-11-24 02:11:03.335 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:07 compute-0 nova_compute[186999]: 2025-11-24 02:11:07.100 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:08 compute-0 nova_compute[186999]: 2025-11-24 02:11:08.338 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:09 compute-0 podman[223930]: 2025-11-24 02:11:09.814296069 +0000 UTC m=+0.066856517 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:11:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:11:12 compute-0 nova_compute[186999]: 2025-11-24 02:11:12.102 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:13 compute-0 nova_compute[186999]: 2025-11-24 02:11:13.340 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:15 compute-0 podman[223950]: 2025-11-24 02:11:15.80212254 +0000 UTC m=+0.055444786 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Nov 24 02:11:16 compute-0 nova_compute[186999]: 2025-11-24 02:11:16.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:17 compute-0 nova_compute[186999]: 2025-11-24 02:11:17.104 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:18 compute-0 nova_compute[186999]: 2025-11-24 02:11:18.342 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:20 compute-0 nova_compute[186999]: 2025-11-24 02:11:20.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:20 compute-0 nova_compute[186999]: 2025-11-24 02:11:20.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:11:20 compute-0 nova_compute[186999]: 2025-11-24 02:11:20.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:11:20 compute-0 nova_compute[186999]: 2025-11-24 02:11:20.786 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:11:20 compute-0 podman[223971]: 2025-11-24 02:11:20.80082608 +0000 UTC m=+0.053670155 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.106 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.795 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.796 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.797 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.950 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.951 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.45108032226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.951 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:11:22 compute-0 nova_compute[186999]: 2025-11-24 02:11:22.951 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.019 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.019 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.041 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.051 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.052 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.052 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:11:23 compute-0 nova_compute[186999]: 2025-11-24 02:11:23.345 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:24 compute-0 nova_compute[186999]: 2025-11-24 02:11:24.049 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:25 compute-0 nova_compute[186999]: 2025-11-24 02:11:25.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:25 compute-0 podman[223996]: 2025-11-24 02:11:25.816508811 +0000 UTC m=+0.070858771 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 02:11:27 compute-0 nova_compute[186999]: 2025-11-24 02:11:27.108 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:28 compute-0 nova_compute[186999]: 2025-11-24 02:11:28.347 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:29 compute-0 nova_compute[186999]: 2025-11-24 02:11:29.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:11:29 compute-0 nova_compute[186999]: 2025-11-24 02:11:29.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:11:30 compute-0 podman[224015]: 2025-11-24 02:11:30.812731051 +0000 UTC m=+0.066791656 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:11:30 compute-0 podman[224016]: 2025-11-24 02:11:30.815692114 +0000 UTC m=+0.061359632 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:11:30 compute-0 podman[224017]: 2025-11-24 02:11:30.84318546 +0000 UTC m=+0.089929258 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 02:11:32 compute-0 nova_compute[186999]: 2025-11-24 02:11:32.111 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:33 compute-0 nova_compute[186999]: 2025-11-24 02:11:33.349 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:37 compute-0 nova_compute[186999]: 2025-11-24 02:11:37.113 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:38 compute-0 nova_compute[186999]: 2025-11-24 02:11:38.351 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:40 compute-0 podman[224085]: 2025-11-24 02:11:40.805614874 +0000 UTC m=+0.058803370 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:11:42 compute-0 nova_compute[186999]: 2025-11-24 02:11:42.115 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:43 compute-0 nova_compute[186999]: 2025-11-24 02:11:43.353 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:46 compute-0 sshd-session[224105]: Received disconnect from 154.90.59.75 port 45516:11: Bye Bye [preauth]
Nov 24 02:11:46 compute-0 sshd-session[224105]: Disconnected from authenticating user root 154.90.59.75 port 45516 [preauth]
Nov 24 02:11:46 compute-0 podman[224107]: 2025-11-24 02:11:46.800871295 +0000 UTC m=+0.061330152 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 02:11:47 compute-0 nova_compute[186999]: 2025-11-24 02:11:47.117 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:48 compute-0 nova_compute[186999]: 2025-11-24 02:11:48.354 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:11:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:11:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:11:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:11:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:11:48.431 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:11:51 compute-0 podman[224128]: 2025-11-24 02:11:51.801012837 +0000 UTC m=+0.052653100 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:11:52 compute-0 nova_compute[186999]: 2025-11-24 02:11:52.120 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:53 compute-0 nova_compute[186999]: 2025-11-24 02:11:53.356 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:56 compute-0 podman[224153]: 2025-11-24 02:11:56.805144586 +0000 UTC m=+0.054719039 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:11:57 compute-0 nova_compute[186999]: 2025-11-24 02:11:57.164 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:11:58 compute-0 nova_compute[186999]: 2025-11-24 02:11:58.387 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:01 compute-0 podman[224172]: 2025-11-24 02:12:01.809704287 +0000 UTC m=+0.062034412 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:12:01 compute-0 podman[224173]: 2025-11-24 02:12:01.832583951 +0000 UTC m=+0.082400633 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:12:01 compute-0 podman[224174]: 2025-11-24 02:12:01.845776684 +0000 UTC m=+0.089409956 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:12:02 compute-0 nova_compute[186999]: 2025-11-24 02:12:02.205 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:03 compute-0 nova_compute[186999]: 2025-11-24 02:12:03.441 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:06 compute-0 nova_compute[186999]: 2025-11-24 02:12:06.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:06 compute-0 nova_compute[186999]: 2025-11-24 02:12:06.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 02:12:07 compute-0 nova_compute[186999]: 2025-11-24 02:12:07.207 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:08 compute-0 nova_compute[186999]: 2025-11-24 02:12:08.443 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:11 compute-0 podman[224238]: 2025-11-24 02:12:11.795975307 +0000 UTC m=+0.052721232 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:12:12 compute-0 nova_compute[186999]: 2025-11-24 02:12:12.210 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:13 compute-0 nova_compute[186999]: 2025-11-24 02:12:13.445 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:17 compute-0 nova_compute[186999]: 2025-11-24 02:12:17.212 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:17 compute-0 nova_compute[186999]: 2025-11-24 02:12:17.785 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:17 compute-0 podman[224258]: 2025-11-24 02:12:17.818620602 +0000 UTC m=+0.061086294 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 24 02:12:18 compute-0 nova_compute[186999]: 2025-11-24 02:12:18.448 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.784 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.784 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.785 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 02:12:21 compute-0 nova_compute[186999]: 2025-11-24 02:12:21.796 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 02:12:22 compute-0 nova_compute[186999]: 2025-11-24 02:12:22.213 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:22 compute-0 nova_compute[186999]: 2025-11-24 02:12:22.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:22 compute-0 nova_compute[186999]: 2025-11-24 02:12:22.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:22 compute-0 nova_compute[186999]: 2025-11-24 02:12:22.783 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:22 compute-0 podman[224282]: 2025-11-24 02:12:22.815614605 +0000 UTC m=+0.066939305 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 02:12:23 compute-0 nova_compute[186999]: 2025-11-24 02:12:23.449 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:23 compute-0 nova_compute[186999]: 2025-11-24 02:12:23.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.696 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.792 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.793 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.793 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.794 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.929 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.930 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.45108032226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.931 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:12:24 compute-0 nova_compute[186999]: 2025-11-24 02:12:24.931 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.040 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.040 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.094 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing inventories for resource provider f28f14d1-2972-450a-b67e-0899e7918234 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.149 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating ProviderTree inventory for provider f28f14d1-2972-450a-b67e-0899e7918234 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.150 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Updating inventory in ProviderTree for provider f28f14d1-2972-450a-b67e-0899e7918234 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.161 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing aggregate associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.182 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Refreshing trait associations for resource provider f28f14d1-2972-450a-b67e-0899e7918234, traits: COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AESNI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,HW_CPU_X86_ABM,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.201 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.213 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.214 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:12:25 compute-0 nova_compute[186999]: 2025-11-24 02:12:25.215 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:12:27 compute-0 nova_compute[186999]: 2025-11-24 02:12:27.216 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:27 compute-0 podman[224307]: 2025-11-24 02:12:27.794865201 +0000 UTC m=+0.049037904 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:12:28 compute-0 nova_compute[186999]: 2025-11-24 02:12:28.215 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:28 compute-0 nova_compute[186999]: 2025-11-24 02:12:28.451 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:29 compute-0 nova_compute[186999]: 2025-11-24 02:12:29.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:30 compute-0 nova_compute[186999]: 2025-11-24 02:12:30.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:12:30 compute-0 nova_compute[186999]: 2025-11-24 02:12:30.783 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:12:32 compute-0 nova_compute[186999]: 2025-11-24 02:12:32.218 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:32 compute-0 podman[224328]: 2025-11-24 02:12:32.81070049 +0000 UTC m=+0.057077758 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 02:12:32 compute-0 podman[224329]: 2025-11-24 02:12:32.83068097 +0000 UTC m=+0.076473771 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:12:32 compute-0 podman[224330]: 2025-11-24 02:12:32.884654977 +0000 UTC m=+0.125487234 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 24 02:12:33 compute-0 nova_compute[186999]: 2025-11-24 02:12:33.453 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:37 compute-0 nova_compute[186999]: 2025-11-24 02:12:37.233 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:38 compute-0 nova_compute[186999]: 2025-11-24 02:12:38.459 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:42 compute-0 nova_compute[186999]: 2025-11-24 02:12:42.275 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:42 compute-0 podman[224394]: 2025-11-24 02:12:42.815615422 +0000 UTC m=+0.064629028 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 24 02:12:43 compute-0 nova_compute[186999]: 2025-11-24 02:12:43.500 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:47 compute-0 nova_compute[186999]: 2025-11-24 02:12:47.321 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:12:48.432 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:12:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:12:48.432 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:12:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:12:48.432 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:12:48 compute-0 nova_compute[186999]: 2025-11-24 02:12:48.502 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:48 compute-0 podman[224415]: 2025-11-24 02:12:48.811654735 +0000 UTC m=+0.064067990 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 24 02:12:52 compute-0 nova_compute[186999]: 2025-11-24 02:12:52.370 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:53 compute-0 nova_compute[186999]: 2025-11-24 02:12:53.544 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:53 compute-0 podman[224436]: 2025-11-24 02:12:53.797915284 +0000 UTC m=+0.054040160 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:12:57 compute-0 nova_compute[186999]: 2025-11-24 02:12:57.373 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:58 compute-0 nova_compute[186999]: 2025-11-24 02:12:58.550 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:12:58 compute-0 podman[224460]: 2025-11-24 02:12:58.811821057 +0000 UTC m=+0.057996725 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 02:13:02 compute-0 sshd-session[224479]: Received disconnect from 154.90.59.75 port 50210:11: Bye Bye [preauth]
Nov 24 02:13:02 compute-0 sshd-session[224479]: Disconnected from authenticating user root 154.90.59.75 port 50210 [preauth]
Nov 24 02:13:02 compute-0 nova_compute[186999]: 2025-11-24 02:13:02.408 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:03 compute-0 nova_compute[186999]: 2025-11-24 02:13:03.551 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:03 compute-0 podman[224482]: 2025-11-24 02:13:03.809795506 +0000 UTC m=+0.057442418 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 02:13:03 compute-0 podman[224483]: 2025-11-24 02:13:03.839443677 +0000 UTC m=+0.083904117 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:13:03 compute-0 podman[224481]: 2025-11-24 02:13:03.839712925 +0000 UTC m=+0.091336903 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 24 02:13:07 compute-0 nova_compute[186999]: 2025-11-24 02:13:07.446 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:08 compute-0 nova_compute[186999]: 2025-11-24 02:13:08.592 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:13:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:13:12 compute-0 nova_compute[186999]: 2025-11-24 02:13:12.450 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:13 compute-0 nova_compute[186999]: 2025-11-24 02:13:13.593 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:13 compute-0 podman[224549]: 2025-11-24 02:13:13.838926021 +0000 UTC m=+0.087093409 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 24 02:13:17 compute-0 nova_compute[186999]: 2025-11-24 02:13:17.489 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:18 compute-0 nova_compute[186999]: 2025-11-24 02:13:18.626 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:18 compute-0 nova_compute[186999]: 2025-11-24 02:13:18.773 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:19 compute-0 podman[224570]: 2025-11-24 02:13:19.80965555 +0000 UTC m=+0.060356743 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.534 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.772 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.784 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:13:22 compute-0 nova_compute[186999]: 2025-11-24 02:13:22.784 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:23 compute-0 nova_compute[186999]: 2025-11-24 02:13:23.671 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:24 compute-0 nova_compute[186999]: 2025-11-24 02:13:24.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:24 compute-0 nova_compute[186999]: 2025-11-24 02:13:24.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:24 compute-0 podman[224591]: 2025-11-24 02:13:24.79250098 +0000 UTC m=+0.048129709 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.800 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.801 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.801 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.801 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.950 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.951 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.4510612487793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.951 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:13:26 compute-0 nova_compute[186999]: 2025-11-24 02:13:26.951 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.030 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.031 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.056 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.070 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.071 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.072 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:13:27 compute-0 nova_compute[186999]: 2025-11-24 02:13:27.536 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:28 compute-0 nova_compute[186999]: 2025-11-24 02:13:28.673 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:29 compute-0 podman[224615]: 2025-11-24 02:13:29.042871958 +0000 UTC m=+0.047533991 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:13:30 compute-0 nova_compute[186999]: 2025-11-24 02:13:30.072 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:32 compute-0 nova_compute[186999]: 2025-11-24 02:13:32.540 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:32 compute-0 nova_compute[186999]: 2025-11-24 02:13:32.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:13:32 compute-0 nova_compute[186999]: 2025-11-24 02:13:32.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:13:33 compute-0 nova_compute[186999]: 2025-11-24 02:13:33.676 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:34 compute-0 podman[224633]: 2025-11-24 02:13:34.799664517 +0000 UTC m=+0.056760069 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 02:13:34 compute-0 podman[224634]: 2025-11-24 02:13:34.806459184 +0000 UTC m=+0.053846334 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:13:34 compute-0 podman[224635]: 2025-11-24 02:13:34.835871808 +0000 UTC m=+0.086004358 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 24 02:13:37 compute-0 nova_compute[186999]: 2025-11-24 02:13:37.543 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:38 compute-0 nova_compute[186999]: 2025-11-24 02:13:38.678 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:42 compute-0 nova_compute[186999]: 2025-11-24 02:13:42.546 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:43 compute-0 nova_compute[186999]: 2025-11-24 02:13:43.678 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:44 compute-0 podman[224699]: 2025-11-24 02:13:44.795777421 +0000 UTC m=+0.055325727 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:13:47 compute-0 nova_compute[186999]: 2025-11-24 02:13:47.548 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:13:48.433 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:13:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:13:48.434 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:13:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:13:48.434 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:13:48 compute-0 nova_compute[186999]: 2025-11-24 02:13:48.680 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:50 compute-0 podman[224719]: 2025-11-24 02:13:50.803168374 +0000 UTC m=+0.053645879 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Nov 24 02:13:52 compute-0 nova_compute[186999]: 2025-11-24 02:13:52.573 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:53 compute-0 nova_compute[186999]: 2025-11-24 02:13:53.680 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:55 compute-0 podman[224741]: 2025-11-24 02:13:55.790563618 +0000 UTC m=+0.046871662 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 02:13:57 compute-0 nova_compute[186999]: 2025-11-24 02:13:57.575 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:58 compute-0 nova_compute[186999]: 2025-11-24 02:13:58.682 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:13:59 compute-0 podman[224766]: 2025-11-24 02:13:59.79171309 +0000 UTC m=+0.043809233 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 24 02:14:02 compute-0 anacron[215686]: Job `cron.daily' started
Nov 24 02:14:02 compute-0 anacron[215686]: Job `cron.daily' terminated
Nov 24 02:14:02 compute-0 nova_compute[186999]: 2025-11-24 02:14:02.579 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:03 compute-0 sshd-session[224740]: Connection closed by authenticating user root 68.210.96.117 port 57532 [preauth]
Nov 24 02:14:03 compute-0 nova_compute[186999]: 2025-11-24 02:14:03.684 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:05 compute-0 podman[224788]: 2025-11-24 02:14:05.820543176 +0000 UTC m=+0.059243371 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:14:05 compute-0 podman[224787]: 2025-11-24 02:14:05.837614771 +0000 UTC m=+0.075685468 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true)
Nov 24 02:14:05 compute-0 podman[224789]: 2025-11-24 02:14:05.876999194 +0000 UTC m=+0.108202532 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 02:14:07 compute-0 nova_compute[186999]: 2025-11-24 02:14:07.581 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:08 compute-0 nova_compute[186999]: 2025-11-24 02:14:08.686 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:12 compute-0 nova_compute[186999]: 2025-11-24 02:14:12.584 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:13 compute-0 nova_compute[186999]: 2025-11-24 02:14:13.687 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:15 compute-0 sshd-session[224857]: Received disconnect from 154.90.59.75 port 45528:11: Bye Bye [preauth]
Nov 24 02:14:15 compute-0 sshd-session[224857]: Disconnected from authenticating user root 154.90.59.75 port 45528 [preauth]
Nov 24 02:14:15 compute-0 podman[224859]: 2025-11-24 02:14:15.800907033 +0000 UTC m=+0.055688128 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:14:17 compute-0 nova_compute[186999]: 2025-11-24 02:14:17.588 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:18 compute-0 nova_compute[186999]: 2025-11-24 02:14:18.688 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:19 compute-0 nova_compute[186999]: 2025-11-24 02:14:19.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:21 compute-0 podman[224879]: 2025-11-24 02:14:21.802290202 +0000 UTC m=+0.055805371 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 02:14:22 compute-0 nova_compute[186999]: 2025-11-24 02:14:22.590 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.690 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.771 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.772 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.781 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 02:14:23 compute-0 nova_compute[186999]: 2025-11-24 02:14:23.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:24 compute-0 nova_compute[186999]: 2025-11-24 02:14:24.771 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.767 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.782 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.783 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:26 compute-0 podman[224900]: 2025-11-24 02:14:26.790620062 +0000 UTC m=+0.046116700 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.804 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.805 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.805 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.934 187003 WARNING nova.virt.libvirt.driver [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.935 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5758MB free_disk=73.45107650756836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.935 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.935 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.985 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 02:14:26 compute-0 nova_compute[186999]: 2025-11-24 02:14:26.986 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 02:14:27 compute-0 nova_compute[186999]: 2025-11-24 02:14:27.006 187003 DEBUG nova.compute.provider_tree [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed in ProviderTree for provider: f28f14d1-2972-450a-b67e-0899e7918234 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 02:14:27 compute-0 nova_compute[186999]: 2025-11-24 02:14:27.018 187003 DEBUG nova.scheduler.client.report [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Inventory has not changed for provider f28f14d1-2972-450a-b67e-0899e7918234 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 02:14:27 compute-0 nova_compute[186999]: 2025-11-24 02:14:27.019 187003 DEBUG nova.compute.resource_tracker [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 02:14:27 compute-0 nova_compute[186999]: 2025-11-24 02:14:27.019 187003 DEBUG oslo_concurrency.lockutils [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:14:27 compute-0 nova_compute[186999]: 2025-11-24 02:14:27.593 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:28 compute-0 nova_compute[186999]: 2025-11-24 02:14:28.018 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:28 compute-0 nova_compute[186999]: 2025-11-24 02:14:28.691 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:29 compute-0 nova_compute[186999]: 2025-11-24 02:14:29.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:30 compute-0 podman[224925]: 2025-11-24 02:14:30.815444015 +0000 UTC m=+0.064022284 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 02:14:32 compute-0 nova_compute[186999]: 2025-11-24 02:14:32.596 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:32 compute-0 nova_compute[186999]: 2025-11-24 02:14:32.770 187003 DEBUG oslo_service.periodic_task [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 02:14:32 compute-0 nova_compute[186999]: 2025-11-24 02:14:32.770 187003 DEBUG nova.compute.manager [None req-c4ee0022-1ff1-4195-a6bc-bf0795a7fe35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 02:14:33 compute-0 nova_compute[186999]: 2025-11-24 02:14:33.694 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:36 compute-0 podman[224945]: 2025-11-24 02:14:36.817542859 +0000 UTC m=+0.064135657 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 02:14:36 compute-0 podman[224944]: 2025-11-24 02:14:36.837701546 +0000 UTC m=+0.088968465 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 02:14:36 compute-0 podman[224946]: 2025-11-24 02:14:36.867721001 +0000 UTC m=+0.113447824 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 02:14:37 compute-0 nova_compute[186999]: 2025-11-24 02:14:37.597 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:38 compute-0 nova_compute[186999]: 2025-11-24 02:14:38.694 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:42 compute-0 nova_compute[186999]: 2025-11-24 02:14:42.600 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:43 compute-0 nova_compute[186999]: 2025-11-24 02:14:43.696 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:46 compute-0 podman[225012]: 2025-11-24 02:14:46.822822059 +0000 UTC m=+0.068345546 container health_status 735abc2eee02b322937beb41c2f243ceab9723b900808b110a6240577f41dc96 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 02:14:47 compute-0 nova_compute[186999]: 2025-11-24 02:14:47.604 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:14:48.434 104238 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 02:14:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:14:48.435 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 02:14:48 compute-0 ovn_metadata_agent[104233]: 2025-11-24 02:14:48.435 104238 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 02:14:48 compute-0 nova_compute[186999]: 2025-11-24 02:14:48.699 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:52 compute-0 nova_compute[186999]: 2025-11-24 02:14:52.606 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:52 compute-0 podman[225032]: 2025-11-24 02:14:52.825865578 +0000 UTC m=+0.086235889 container health_status 47f67cdd53623769e29607c584c0f8d623a7ff67022a787c87ec4cddf03af3c5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 02:14:53 compute-0 nova_compute[186999]: 2025-11-24 02:14:53.701 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:57 compute-0 nova_compute[186999]: 2025-11-24 02:14:57.609 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:14:57 compute-0 podman[225054]: 2025-11-24 02:14:57.796107795 +0000 UTC m=+0.053422634 container health_status b4425cde2c9e58c83010bcc9469e2ff23ecbfd57dc8ac4e6a1962672af235213 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 02:14:58 compute-0 nova_compute[186999]: 2025-11-24 02:14:58.703 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:01 compute-0 podman[225078]: 2025-11-24 02:15:01.791874031 +0000 UTC m=+0.044150014 container health_status ef7355250f77fae19c9874f66aaeb5361e2a4f1cc77235116779f4f61cfb3c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 02:15:02 compute-0 sshd-session[225097]: Accepted publickey for zuul from 192.168.122.10 port 36544 ssh2: ECDSA SHA256:NGjj0J617jZVM245GcZAL8wwtc0iOZtAioQFxLsO1oE
Nov 24 02:15:02 compute-0 systemd-logind[791]: New session 29 of user zuul.
Nov 24 02:15:02 compute-0 systemd[1]: Started Session 29 of User zuul.
Nov 24 02:15:02 compute-0 sshd-session[225097]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 02:15:02 compute-0 nova_compute[186999]: 2025-11-24 02:15:02.610 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:02 compute-0 sudo[225101]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 02:15:02 compute-0 sudo[225101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 02:15:03 compute-0 nova_compute[186999]: 2025-11-24 02:15:03.705 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:07 compute-0 ovs-vsctl[225273]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 02:15:07 compute-0 nova_compute[186999]: 2025-11-24 02:15:07.612 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:07 compute-0 podman[225321]: 2025-11-24 02:15:07.823470925 +0000 UTC m=+0.061663948 container health_status 82746fea1aef0d202cfddf5b39734d00c179308e135979f896d51a390677ec6b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 02:15:07 compute-0 podman[225320]: 2025-11-24 02:15:07.835239526 +0000 UTC m=+0.068206161 container health_status 493d0b927eac478492f5a6fc3221ce572ed7f60d030b37bb07e361676301d8b6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 02:15:07 compute-0 podman[225322]: 2025-11-24 02:15:07.857802532 +0000 UTC m=+0.095589393 container health_status c8e3bb0afaa231ef13aabd21a1c4d536c69b5810eecca218d4d8a314a082f5c4 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 02:15:08 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 02:15:08 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 02:15:08 compute-0 virtqemud[186602]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 02:15:08 compute-0 nova_compute[186999]: 2025-11-24 02:15:08.707 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:09 compute-0 crontab[225748]: (root) LIST (root)
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 ceilometer_agent_compute[197695]: 2025-11-24 02:15:11.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 02:15:11 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 02:15:11 compute-0 systemd[1]: Started Hostname Service.
Nov 24 02:15:12 compute-0 nova_compute[186999]: 2025-11-24 02:15:12.613 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 02:15:13 compute-0 nova_compute[186999]: 2025-11-24 02:15:13.708 187003 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
